Rahul Ranjan

OpenTelemetry with Elastic Observability

OpenTelemetry is an open-source framework for observability that, when combined with Elastic Observability, provides powerful insights into distributed systems. It enables organizations to efficiently monitor, troubleshoot, and optimize their applications. In this article, we will provide you with a detailed guide on how to set up an OpenTelemetry demo with Elastic Observability. We will cover essential steps, configurations, and best practices that will help you leverage the full potential of observability in your environment.

Understanding OpenTelemetry and Elastic Observability:

OpenTelemetry:
OpenTelemetry is a project under the Cloud Native Computing Foundation (CNCF) that provides a unified way to instrument, generate, collect, and export telemetry data, including metrics, traces, and logs, from software applications. It offers libraries for instrumenting code in various programming languages and standardized APIs to capture telemetry data from the different components of a distributed system.
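
For example, a Python service can often be instrumented without code changes using OpenTelemetry's zero-code instrumentation agent. This is a minimal sketch, not part of the demo; app.py and the local collector endpoint are assumed placeholders:

# install the OpenTelemetry Python distribution and the OTLP exporter
pip install opentelemetry-distro opentelemetry-exporter-otlp

# name the service and point the SDK at an OTLP endpoint (here: a local collector)
export OTEL_SERVICE_NAME=my-service
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

# run the application under the auto-instrumentation agent
opentelemetry-instrument python app.py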

Elastic Observability:
Elastic Observability is a complete solution for observability provided by Elastic. It provides integrated tools for monitoring, logging, and tracing distributed applications. The solution includes Elastic APM (Application Performance Monitoring), Elastic Logs, and Elastic Metrics, all of which are seamlessly integrated within the Elastic Stack.

Setup Details:
This doc covers how to set up the OpenTelemetry demo with Elastic Observability using either Docker Compose or Kubernetes.

Download the source code of the application from the GitHub repo. You can deploy it on a Kubernetes cluster in your cloud service of choice or on a local Kubernetes platform. First, clone the repository locally, and make sure you also have kubectl and helm installed:

git clone https://github.com/elastic/opentelemetry-demo.git

OTEL_EXPORTER_OTLP_ENDPOINT: Elastic's APM Server endpoint
OTEL_EXPORTER_OTLP_HEADERS: Elastic authorization header

Under Integrations → APM in your Elastic Cloud deployment, you can find these values in the OpenTelemetry setup instructions.
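
For example, when sending OTLP data from an instrumented application directly to Elastic APM (without going through the Collector), these two values are typically set as the standard OTLP environment variables. The endpoint and token below reuse the placeholder values from this article:

export OTEL_EXPORTER_OTLP_ENDPOINT="https://11111111111.apm.xyz.xyz.cloud.es.io:443"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer aaaaaaaaaaaaaaa"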

Docker Compose
Start a free trial on Elastic Cloud and copy the endpoint and secretToken from the Elastic APM setup instructions in your Kibana.

Open the file src/otelcollector/otelcol-elastic-config-extras.yml in an editor and replace the following two placeholders:

YOUR_APM_ENDPOINT_WITHOUT_HTTPS_PREFIX: Your Elastic APM endpoint (without https:// prefix) that must also include the port (for example: 987654.xyz.com:443).
YOUR_APM_SECRET_TOKEN: your Elastic APM secret token.
The updated file should look like the example below (note the required format for the secret token, including the Bearer keyword):

exporters:
  otlp/elastic:
    # !!! Elastic APM https endpoint WITHOUT the "https://" prefix
    endpoint: "11111111111.apm.xyz.xyz.cloud.es.io:443"
    compression: none
    headers:
      Authorization: "Bearer aaaaaaaaaaaaaaa"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [spanmetrics, otlp/elastic]
    metrics:
      receivers: [otlp, spanmetrics]
      processors: [batch]
      exporters: [otlp/elastic]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/elastic]


Optional: If you are using a shared Elastic cluster, set an environment name so you can distinguish your data. Go to the .env file in the project root, locate the OTEL_RESOURCE_ATTRIBUTES variable, and add your environment name as shown below.

OTEL_RESOURCE_ATTRIBUTES="service.namespace=opentelemetry-demo,deployment.environment=<Any Name>"

Start the demo with the below command from the repository’s root directory:

docker-compose up -d

Verify that the application is running correctly by checking the links below and testing their functionality.

Web store: http://localhost:8080/
Grafana: http://localhost:8080/grafana/
Load Generator UI: http://localhost:8080/loadgen/
Jaeger UI: http://localhost:8080/jaeger/ui/
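
You can also confirm that all of the demo's containers are up by running the following from the project root:

docker-compose ps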

To stop the application, run the following from the project root directory:

docker-compose down

Kubernetes

Create a Kubernetes cluster. Set up kubectl and Helm.
Set up Elastic Observability on Elastic Cloud.

Create a secret in Kubernetes with the following command.

kubectl create secret generic elastic-secret \
  --from-literal=elastic_apm_endpoint='YOUR_APM_ENDPOINT_WITHOUT_HTTPS_PREFIX' \
  --from-literal=elastic_apm_secret_token='YOUR_APM_SECRET_TOKEN'

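You can verify that the secret was created (the secret values themselves are not displayed):

kubectl get secret elastic-secret -n default
kubectl describe secret elastic-secret -n default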

Execute the following commands to deploy the OpenTelemetry demo to your Kubernetes cluster.

# switch to the kubernetes/elastic-helm directory
cd kubernetes/elastic-helm

# Add the open-telemetry Helm repository
helm repo add open-telemetry https://open-telemetry.github.io/opentelemetry-helm-charts

# Update the repo
helm repo update open-telemetry

# deploy the demo through helm install
helm install -f values.yaml my-otel-demo open-telemetry/opentelemetry-demo
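
The demo images can take a few minutes to pull and start; you can optionally wait for all pods to become ready before continuing:

kubectl wait --for=condition=Ready pods --all -n default --timeout=300s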

Once your application is up on Kubernetes, validate that all the pods are running in the default namespace.

kubectl get pods -n default

Verify that the application is running correctly by checking the links below and testing their functionality. On Kubernetes you may first need to expose the frontend proxy locally, as shown after the list.

Web store: http://localhost:8080/
Grafana: http://localhost:8080/grafana/
Load Generator UI: http://localhost:8080/loadgen/
Jaeger UI: http://localhost:8080/jaeger/ui/
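
A minimal way to expose it is kubectl port-forward. The service name below assumes the chart exposes the demo's frontend proxy as my-otel-demo-frontendproxy for the my-otel-demo release; run kubectl get svc to confirm the exact name in your cluster.

# expose the demo's frontend proxy on localhost:8080
kubectl port-forward svc/my-otel-demo-frontendproxy 8080:8080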

Kubernetes Monitoring:

This demo includes cluster-level metrics and Kubernetes events collection. To enable node-level metrics collection and autodiscovery for Redis pods, run an additional OTel Collector DaemonSet.

helm install daemonset open-telemetry/opentelemetry-collector --values daemonset.yaml
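
You can confirm that the DaemonSet collector has been scheduled on each node:

kubectl get daemonsets -n default
kubectl get pods -n default | grep -i collector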

Explore and analyze the data with Elastic

View your OTel-instrumented services in Kibana's APM Service Map. To access it, go to APM in the Elastic Observability UI and select Service Map.


If you see your services in the service map, it means that data is being sent to the Elastic cluster by the OpenTelemetry Collector. You can now explore the data and experiment with it.

To get a comprehensive understanding of all the services and transaction flows between them, you can refer to the APM service map (as demonstrated in the previous step). Additionally, you have the option to examine individual services and the collected transactions.


For each service, for example the loadgenerator, the following details are listed:

  • Average service latency
  • Throughput
  • Main transactions
  • Failed transaction rate
  • Errors
  • Dependencies

Now click on Transactions → GET (or any request) to see the full trace with all of its spans. You can explore and analyze the data further, examining it in minute detail.

Elastic uses machine learning to identify potential latency issues across services by analyzing the trace. Open the Latency Correlations tab and run the correlation analysis.


Troubleshoot OTel data ingest issues:

If you are unable to see any data for your environment, start by checking the OTel Collector logs with the following command:

docker-compose logs -f otelcol
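
If you deployed the demo on Kubernetes instead, locate the Collector pod and tail its logs; <otelcol-pod-name> below is a placeholder for the actual pod name:

kubectl get pods -n default | grep -i otelcol
kubectl logs -n default <otelcol-pod-name> -f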

If you come across any connection-related errors, double-check the URL and token configuration.
If you are unable to see any services running, try a full restart of Docker Compose and see if that helps.

Analyze your data with Elastic machine learning (ML):
After integrating OpenTelemetry metrics with Elastic, you can begin to analyze your data using Elastic's machine-learning capabilities.

References:

https://www.linkedin.com/pulse/opentelemetry-elastic-observability-rahul-ranjan-3l6bc/

https://medium.com/@rahul.fiem/opentelemetry-demo-with-elastic-observability-45d938a65ef8

https://rranjan.hashnode.dev/opentelemetry-with-elastic-observability

Repo: https://github.com/elastic/opentelemetry-demo
