The Elastic Stack, widely known as ELK (Elasticsearch, Logstash, and Kibana), is a powerful open-source platform for analyzing data. It offers a comprehensive set of features for indexing, searching, monitoring, and analyzing data in real time.
The Elastic Stack can be deployed on-premises or in the cloud. It’s used by startups and large enterprises alike, including Netflix, Facebook, Goldman Sachs, The Guardian, and Thomson Reuters.
The Elastic Stack is made up of three core components:
- Elasticsearch: A search and analytics engine that indexes data quickly and provides fast search results.
- Logstash: A log pipeline tool that collects, parses, and stores logs from multiple sources.
- Kibana: A data visualization and analytics tool that enables you to search, view, analyze and share data.
Each of these components offers unique features and benefits.
Elasticsearch is a fast, scalable, and easy-to-use search and analytics engine. It is well suited for real-time applications such as website search, product catalogs, customer support, and security event analysis.
Logstash is a log pipeline tool that collects, parses, and stores logs from multiple sources. It helps you to get your data where it needs to go, when it needs to go there. Logstash can also be used to monitor your system in near real-time, so you can identify and fix problems before they become serious.
Kibana is a data visualization and analytics tool that enables you to search, view, analyze and share data. With Kibana, you can quickly find insights in your data and see how your business is performing. Kibana makes it easy to create custom dashboards, so you can track the metrics that matter most to you.
Elastic Stack is one of the most popular big data solutions, and for a good reason. It’s easy to set up, it’s scalable, and it offers a wealth of features for indexing, searching, monitoring, and analyzing data.
If you’re looking for a powerful open-source platform for analyzing data, the Elastic Stack is worth considering.
The ELK Stack is designed to let developers quickly get data into a searchable and analyzable state without needing an army of consultants. Users can ingest data from any source and use Kibana to visualize it, for example as a time series.
In this tutorial, we will walk you through the steps to install the Elastic Stack on an Ubuntu 20.04 server.
Prerequisites
- Minimal installation of Ubuntu Server 20.04 LTS with SSH access and a non-root user with sudo privileges.
- Minimum of 4 GB RAM, 2 CPUs, and 20 GB hard drive space available.
Updating the System
It’s a good practice to keep your server packages up-to-date. Run the following command to update your repository cache:
sudo apt update && sudo apt upgrade -y
Once the update is complete, run the command below to install required dependencies.
sudo apt install wget apt-transport-https curl gnupg2 -y
Elasticsearch requires Java to run. Ubuntu 20.04 ships both Java 8 and Java 11 in its repositories, but some Elasticsearch plugins may not be compatible with Java 11, so we will install Java 8.
Run the following command to install Java 8 on your system.
sudo apt install openjdk-8-jdk -y
Once the installation is complete, you can check the installed Java version by running the following command.
java -version
The output should report an OpenJDK 1.8.x runtime.
The default Ubuntu 20.04 repository does not contain Elasticsearch, but it can be easily installed using APT after adding the official Elastic repository from the Elasticsearch developer team.
First, download the public GPG signing key that Elastic uses to sign its packages, so APT can verify their authenticity. The -fsSL flags tell curl to fail on server errors, run silently while still reporting errors, and follow redirects.
curl -fsSL https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
You will get an OK message if the key was successfully added.
Next, add the Elasticsearch repository to your system by running the following command. We will add elastic-7.x.list to the sources.list.d directory. The sources.list.d directory is a special directory in Ubuntu where you can add additional sources for software. The APT system uses these files to find available packages.
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
Then, update your repository cache to read the new Elasticsearch repository.
sudo apt update
Now you can install Elasticsearch by running the following command.
sudo apt install elasticsearch -y
Once the installation is complete, we will need to configure Elasticsearch. The /etc/elasticsearch/elasticsearch.yml file is the main configuration file for Elasticsearch. For example, its network.host parameter defines which IP address or hostname the Elasticsearch service binds to.
Let’s open the file using the nano text editor.
sudo nano /etc/elasticsearch/elasticsearch.yml
Most of these settings are pre-configured, but you can fine-tune them for your needs. In this example, we only change the network host setting: uncomment network.host and set it to localhost so that Elasticsearch listens only on the loopback interface. If you want Elasticsearch to listen on a specific interface, replace localhost with that interface's IP address or hostname.
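With that change, the network section of /etc/elasticsearch/elasticsearch.yml should look roughly like this (a sketch; the file's surrounding comments are omitted here):

```yaml
# ---------------------------------- Network ----------------------------------
# Bind only to the loopback interface. Replace localhost with a specific
# IP address or hostname to expose the node on that interface instead.
network.host: localhost
```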
Once you are done with the changes, save the file by pressing CTRL+X, Y, and Enter. Now start Elasticsearch using the following command. The start-up process can take a few minutes so be patient.
sudo systemctl start elasticsearch
To enable Elasticsearch to start automatically on boot, run the following command.
sudo systemctl enable elasticsearch
Now you can test the installation by running the following command.
curl -X GET "localhost:9200"
You should see a JSON response from your local node with its name, cluster name, and version details.
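The response should look roughly like the trimmed sketch below; the name, cluster_uuid, and version values are placeholders and will differ on your node:

```json
{
  "name" : "your-hostname",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "7.x.x"
  },
  "tagline" : "You Know, for Search"
}
```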
Now that we have Elasticsearch up and running, we can install Kibana. Kibana is a visual interface for Elasticsearch that allows us to search and visualize our data.
Elastic recommends installing Kibana right after the Elasticsearch server, so that the components of the Elastic Stack are put in place in the correct order.
Run the following command to install Kibana.
sudo apt install kibana -y
Once the installation is complete, we will need to configure Kibana. The /etc/kibana/kibana.yml file is the main configuration file for Kibana. For example, server.port defines the port the Kibana service binds to, and elasticsearch.hosts defines the Elasticsearch URL that Kibana should point to, alongside other security settings.
Let’s open this file using the nano text editor.
sudo nano /etc/kibana/kibana.yml
In this example, we will uncomment these lines: server.port, server.host, and elasticsearch.hosts so that Kibana points to the correct Elasticsearch URL.
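After uncommenting, those three settings carry the stock defaults shown below; change server.host to your server's IP address if you need to reach Kibana from another machine:

```yaml
server.port: 5601
server.host: "localhost"
elasticsearch.hosts: ["http://localhost:9200"]
```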
Save the file by pressing CTRL+X, Y, and Enter, then start and enable Kibana using the following command.
sudo systemctl start kibana && sudo systemctl enable kibana
Now open your favorite web browser and go to http://localhost:5601, replacing localhost with your server's IP address or hostname if Kibana is bound to a non-local interface.
You will be redirected to the Kibana welcome screen, from which you can start using Kibana. If you get a “Kibana server not ready yet” error, give it a few minutes to start up, and make sure that both Elasticsearch and Kibana are running.
Filebeat is a lightweight log shipper that can send data to Elasticsearch. You can install Filebeat using the following command.
sudo apt install filebeat -y
Once the installation is complete, you will need to configure Filebeat. The /etc/filebeat/filebeat.yml file is the main configuration file for all Filebeat settings.
Let’s open this file using the nano text editor.
sudo nano /etc/filebeat/filebeat.yml
Now, uncomment the output.logstash section and its hosts line, and make sure the default output.elasticsearch section stays commented out, since Filebeat allows only one output to be enabled at a time.
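The output section of /etc/filebeat/filebeat.yml should then look roughly like this; 5044 is the conventional default port for Logstash's Beats input, so adjust it if your Logstash pipeline listens elsewhere:

```yaml
# The default Elasticsearch output stays commented out,
# since Filebeat allows only one output at a time:
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]
```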
Save the file by pressing CTRL+X, Y, and Enter. Next, enable the Filebeat system module, load the index template into Elasticsearch, and load the sample dashboards into Kibana. The -E flags temporarily disable the Logstash output so that the setup commands can talk to Elasticsearch and Kibana directly.
sudo filebeat modules enable system && sudo filebeat setup --index-management -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["localhost:9200"]'
sudo filebeat setup -E output.logstash.enabled=false -E output.elasticsearch.hosts=['localhost:9200'] -E setup.kibana.host=localhost:5601
Next, run the following command to start and enable Filebeat.
sudo systemctl start filebeat && sudo systemctl enable filebeat
Finally, run the following command to verify that Elasticsearch is receiving data from Filebeat.
curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty'
You should see a JSON response containing log event hits; a non-zero hit count confirms that Filebeat is shipping data to Elasticsearch.
Now that you have successfully setup Elasticsearch and Kibana, the last step is to install Logstash.
You can install Logstash using the following command.
sudo apt install logstash -y
Once the installation is complete, you can start and enable Logstash using the following command.
sudo systemctl start logstash && sudo systemctl enable logstash
To check whether Logstash is up and running, check its status with the following command.
sudo systemctl status logstash
The output should report the service as active (running).
Now that we have Logstash installed, you can configure it to your needs. You can refer to the Logstash documentation for more information.
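As a minimal sketch, a pipeline that receives events from Filebeat and forwards them to Elasticsearch could be saved under /etc/logstash/conf.d/ (the filename below is hypothetical, and the port and index pattern are conventional defaults rather than values this tutorial has configured elsewhere):

```conf
# /etc/logstash/conf.d/beats.conf
input {
  beats {
    port => 5044    # matches the hosts entry in Filebeat's output.logstash
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```

Logstash picks up any .conf file in that directory automatically, so restarting the service with sudo systemctl restart logstash activates the pipeline.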
Once you are done configuring Logstash, return to the Kibana web interface that you have opened in your web browser in the previous step. From there, you can start managing and visualizing your data. Congratulations! You have now successfully installed the Elastic Stack on your Ubuntu 20.04 server.
In this tutorial, you have learned how to install Elasticsearch, Kibana, and Logstash on an Ubuntu 20.04 server. You also learned how to configure all three components in order for them to communicate with each other.
Leave your comments, suggestions, and questions below. If this article helped you in any way, please hit the share buttons at the top of this page to help others.