Elastic Now Collaborates with AWS to Leverage GenAI Capabilities

Elastic is strengthening its relationship with Amazon Web Services (AWS) by leveraging the latest generative artificial intelligence (AI) services from AWS. As part of this collaboration, Elastic is offering large language model (LLM) observability support for Amazon Bedrock in Elastic Observability. Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies via a single API, along with a broad set of capabilities organisations need to build generative AI applications with security, privacy, and responsible AI.

The new integration offers Site Reliability Engineers (SREs) detailed insight into the performance and usage of their Amazon Bedrock LLMs. SREs can now leverage Elastic Observability to monitor invocation, error, and latency metrics, allowing them to prevent incidents more proactively and identify root causes, ensuring optimal performance for their Amazon Bedrock-powered generative AI applications. Additionally, Elastic AI Assistant, which utilises Amazon Bedrock, helps SREs accurately analyse data and generate visualisations, and provides actionable recommendations for issue resolution.
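For readers who want a sense of what querying those metrics looks like once they land in Elastic, the sketch below shows a simple aggregation over invocation, error, and latency data using the official Elasticsearch Python client. The index pattern and field names (for example, `metrics-aws_bedrock.*` and `aws.bedrock.invocations`) are illustrative assumptions for this sketch, not the integration's documented schema.

```python
# A minimal sketch, assuming the Amazon Bedrock integration ships metrics
# into an Elasticsearch data stream. Index pattern and field names below
# are hypothetical placeholders, not the integration's actual schema.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="YOUR_API_KEY")

resp = es.search(
    index="metrics-aws_bedrock.*",   # assumed index pattern
    size=0,                          # aggregations only, no raw documents
    query={"range": {"@timestamp": {"gte": "now-1h"}}},
    aggs={
        "invocations": {"sum": {"field": "aws.bedrock.invocations"}},          # assumed field
        "errors": {"sum": {"field": "aws.bedrock.invocation_errors"}},         # assumed field
        "avg_latency_ms": {"avg": {"field": "aws.bedrock.invocation_latency"}} # assumed field
    },
)

print(resp["aggregations"])
```

In practice an SRE would more likely view these metrics through the integration's prebuilt dashboards in Kibana; the query above simply illustrates the kind of invocation, error, and latency data the article describes.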

“As LLM-based applications are growing, it’s essential for developers and SREs to be able to monitor, optimise, and troubleshoot how they perform,” said Santosh Krishnan, general manager of Security and Observability Solutions at Elastic. “Today’s integration simplifies the collection of metrics and logs from Amazon Bedrock, in turn streamlining the process of gaining valuable and actionable insights.”
