I have already demonstrated how to run Elasticsearch (along with Kibana and Logstash) using Docker in ELK Stack Local Development using Docker. With Laradock the approach is basically the same, but in this post I will go into more depth.
Prerequisites
You have run a Laradock service/container at least once before. If not, start here.
Run the elasticsearch container
Go to your laradock directory using the CLI and execute:
docker-compose up -d elasticsearch
When you check the container status using docker-compose ps, I hope the result is similar to this:
Name Command State Ports
-------------------------------------------------------------------------------------------------------------------------------------------
laradock_docker-in-docker_1 dockerd-entrypoint.sh Up 2375/tcp, 2376/tcp
laradock_elasticsearch_1 /usr/local/bin/docker-entr ... Up 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp
laradock_php-fpm_1 docker-php-entrypoint php-fpm Up 9000/tcp
laradock_workspace_1 /sbin/my_init Up 0.0.0.0:2222->22/tcp, 0.0.0.0:8001->8000/tcp, 0.0.0.0:8080->8080/tcp
And if you check the resources used by the containers using docker stats, you will probably get a result similar to mine:
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
e4ff663e300c laradock_elasticsearch_1 0.34% 783.6MiB / 3.846GiB 19.90% 2.15kB / 0B 241MB / 422kB 55
33f5234ad041 laradock_php-fpm_1 0.00% 9.578MiB / 3.846GiB 0.24% 1.01kB / 0B 13.1MB / 0B 3
2d14b2b770aa laradock_workspace_1 0.00% 8.832MiB / 3.846GiB 0.22% 2.81kB / 0B 778kB / 57.3kB 6
94caba28a367 laradock_docker-in-docker_1 0.37% 20.11MiB / 3.846GiB 0.51% 1.8kB / 0B 22.2MB / 1.01MB 22
I allocate 4GB of memory to the Linux-based virtual machine in my Docker Desktop (Windows) setup. Elasticsearch alone is taking about 20% of the 3.84GB of available memory, which is a large share for a single container. Imagine if you also ran kibana and logstash, each taking a similar amount: the total would be around 60%. So please plan the memory allocation on your machine, or on your production server, accordingly.
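If that feels too heavy for your machine, one common knob is the Elasticsearch JVM heap size. Below is a minimal docker-compose override sketch; it assumes the service is named elasticsearch (as in the standard laradock docker-compose.yml) and that your laradock version lets you override ES_JAVA_OPTS, so treat the values as assumptions to tune, not as the official laradock configuration:

# docker-compose.override.yml -- a sketch; match the version key to laradock's docker-compose.yml
version: "3.5"
services:
  elasticsearch:
    environment:
      # smaller JVM heap; the container still needs some memory on top of this
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"

After saving the override, recreate the container with docker-compose up -d elasticsearch and check docker stats again.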
Accessing Elasticsearch
If you access the service from outside the laradock environment, such as from the Postman Desktop application or any PHP application configured outside laradock (e.g. a PHP application using WAMP), the full address is http://localhost:9200. Here is the result using Postman:
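If you prefer the command line over Postman, an equivalent check from the host machine could look like this (assuming no authentication is required; if your setup uses Basic Auth as in my Postman screenshot, add -u username:password):

curl http://localhost:9200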
If you access the service from inside the laradock environment, for example using curl in the workspace container, the full address is elasticsearch:9200. Here is the result using curl inside the workspace container:
laradock@2d14b2b770aa:/var/www$ curl elasticsearch:9200
{
  "name" : "laradock-node",
  "cluster_name" : "laradock-cluster",
  "cluster_uuid" : "3juGQWtJTa2L7yYBKnfh-g",
  "version" : {
    "number" : "7.5.1",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96",
    "build_date" : "2019-12-16T22:57:37.835892Z",
    "build_snapshot" : false,
    "lucene_version" : "8.3.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
laradock@2d14b2b770aa:/var/www$
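Once the node responds, you can already do a quick smoke test from the workspace container using the plain REST API. The articles index and the document below are just illustrative examples, not something laradock creates for you:

# index a sample document (this creates the "articles" index on the fly)
curl -X PUT "elasticsearch:9200/articles/_doc/1" -H 'Content-Type: application/json' -d '{"title": "Hello Laradock", "body": "Elasticsearch is running"}'

# search for it
curl "elasticsearch:9200/articles/_search?q=title:laradock"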
Working with Laravel and Elasticsearch
I will look for an available package that wraps Elasticsearch for Laravel and actually works, and put the demo here.
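In the meantime, here is a minimal sketch using the official elasticsearch/elasticsearch PHP client, assuming you have required it via Composer and your code runs inside the laradock network (so the host is elasticsearch, not localhost); the index name and fields are only illustrative:

<?php

require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

// point the client at the laradock service name, not localhost
$client = ClientBuilder::create()
    ->setHosts(['elasticsearch:9200'])
    ->build();

// index an example document
$client->index([
    'index' => 'articles',
    'id'    => 1,
    'body'  => ['title' => 'Hello Laradock'],
]);

// run a simple match query against the example index
$results = $client->search([
    'index' => 'articles',
    'body'  => ['query' => ['match' => ['title' => 'laradock']]],
]);

print_r($results['hits']['hits']);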
Have fun experimenting with elasticsearch in laradock!
versions used:
- laradock ELK_VERSION=7.5.1
- Postman Desktop: Postman v7.21.1
Comments
How do you use logstash in laradock to sync data from SQL Server?
Dockerfile
ARG ELK_VERSION=7.5.1
FROM docker.elastic.co/logstash/logstash:${ELK_VERSION}
USER root
RUN rm -f /usr/share/logstash/pipeline/logstash.conf
RUN curl -L -o /usr/share/logstash/lib/mysql-connector-java-5.1.47.jar repo1.maven.org/maven2/mysql/mysql...
RUN curl -L -o /usr/share/logstash/lib/mssql-jdbc-9.2.1.jre11.jar github.com/microsoft/mssql-jdbc/re...
ADD ./pipeline/ /usr/share/logstash/pipeline/
ADD ./config/ /usr/share/logstash/config/
RUN logstash-plugin install logstash-input-jdbc
RUN logstash-plugin install logstash-input-beats
pipeline.conf
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/lib/mssql-jdbc-9.2.1.jre11.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://servername:1433;databaseName=DatabaseName;"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM DatabaseName..users"
    schedule => "1 * * * *"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "users"
    document_type => "user"
  }
  stdout { codec => rubydebug }
}
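To actually run this, the usual flow would be to rebuild the logstash image and start the container; this assumes the service is named logstash, as in the standard laradock docker-compose.yml:

docker-compose build logstash
docker-compose up -d logstash
# follow the pipeline output
docker-compose logs -f logstash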
I haven't tried logstash yet.
In Postman, you use Basic Auth and set a username & password. I don't know where they come from?
I use basic auth; see the first image in this post.