Saturday, Nov 8, 2025, 02:45 PM – 05:30 PM IST
Vivek Pemawat
Submitted Sep 29, 2025
Session Description
In this session, we will explore how to design a Test Observability Platform that provides deep visibility into tests, builds, and failures across the CI/CD pipeline. The platform combines data from logs, traces, and test reports to deliver actionable insights, enabling teams to detect, analyze, and resolve failures much faster. We will discuss how selective builds, coverage-based test selection, and shift-left vulnerability scanning fit into this platform to improve developer feedback and overall quality.
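To make the coverage-based test selection idea concrete, here is a minimal sketch, not the platform's actual implementation: it assumes a coverage map (test ID to the source files it exercises) exported from a prior full run, and all file names and function names below are illustrative.

```python
# Minimal sketch of coverage-based test selection.
# Assumes coverage_map.json maps test IDs to the source files they exercise,
# e.g. produced from a prior full run with per-test coverage contexts.
import json
import subprocess


def changed_files(base_ref: str = "origin/main") -> set[str]:
    """Files touched by the current change, taken from git."""
    out = subprocess.run(
        ["git", "diff", "--name-only", base_ref, "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return {line.strip() for line in out.stdout.splitlines() if line.strip()}


def select_tests(coverage_map_path: str, base_ref: str = "origin/main") -> list[str]:
    """Return only the tests whose covered files overlap the diff."""
    with open(coverage_map_path) as f:
        coverage_map: dict[str, list[str]] = json.load(f)  # {"test_id": ["src/a.py", ...]}

    touched = changed_files(base_ref)
    selected = [
        test_id
        for test_id, covered in coverage_map.items()
        if touched & set(covered)
    ]
    # Fall back to the full suite if the diff touches files no test covers
    # (e.g. build scripts), so nothing slips through unselected.
    return selected or list(coverage_map)


if __name__ == "__main__":
    for test in select_tests("coverage_map.json"):
        print(test)
```

In practice the selection would also account for transitive dependencies and shared fixtures; the sketch only shows the core file-overlap heuristic that makes selective builds possible.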
The session will also cover how AI/ML can be applied to observability data for smarter triage and root cause analysis (RCA). By correlating signals from OpenSearch (logs), Jaeger (traces), and ReportPortal (test outcomes), the platform can classify failures, detect anomalies, and suggest probable root causes. Attendees will learn how to architect such a system, the trade-offs involved, and how it drives efficiency for engineering and QA teams.
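The following sketch illustrates the correlation-plus-classification idea under stated assumptions; it is not the speaker's system. It assumes logs in OpenSearch carry a `trace_id` field matching the Jaeger trace of the failed test, that the index pattern is `ci-logs-*`, and that the tiny labeled set below stands in for historical failures already triaged (for example, exported from ReportPortal defect types).

```python
# Illustrative sketch: correlate a failed test's logs by trace ID, then
# suggest a probable failure category with a simple text classifier.
from opensearchpy import OpenSearch
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])


def error_logs_for_trace(trace_id: str) -> str:
    """Pull ERROR-level log lines that share the failed test's trace ID."""
    resp = client.search(
        index="ci-logs-*",  # assumed index pattern
        body={
            "size": 200,
            "query": {
                "bool": {
                    "filter": [
                        {"term": {"trace_id": trace_id}},
                        {"term": {"level": "ERROR"}},
                    ]
                }
            },
        },
    )
    return "\n".join(hit["_source"]["message"] for hit in resp["hits"]["hits"])


# Toy training data: in practice this would come from historical failures
# already triaged by engineers, not a hard-coded list.
HISTORY = [
    ("connection refused to artifact registry", "infrastructure"),
    ("OutOfMemoryError: Java heap space", "infrastructure"),
    ("AssertionError: expected 200 but got 500", "product_defect"),
    ("NullPointerException in OrderService.checkout", "product_defect"),
    ("StaleElementReferenceException in login page object", "flaky_test"),
    ("TimeoutError waiting for selector #submit", "flaky_test"),
]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit([text for text, _ in HISTORY], [label for _, label in HISTORY])


def triage(trace_id: str) -> str:
    """Suggest a probable failure category for a failed test's trace."""
    logs = error_logs_for_trace(trace_id)
    return classifier.predict([logs])[0] if logs else "unknown"
```

A production triage pipeline would add trace-span features from Jaeger and test metadata from ReportPortal alongside the log text; the sketch keeps only the log-based signal to show the shape of the approach.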
Key Takeaways
How to design a Test Observability Platform that unifies build optimization, test execution, and RCA.
How to apply AI/ML on observability data (logs, traces, reports) for faster and more accurate failure analysis.
Target Audience
QA and Test Engineers who want deeper visibility into test executions.
DevOps/SRE teams working on scalable and reliable CI/CD pipelines.
Architects and Engineering Managers looking to shift testing and observability left in their organizations.
Bio
I am a Principal Engineer at Acceldata, driving developer productivity through scalable CI/CD, observability, and automation. My work includes building AI-powered agents for build failure analysis, designing reliable release and test infrastructures, and integrating DevSecOps practices across pipelines. I focus on microservices optimization, distributed tracing with Jaeger, and centralized log management with OpenSearch. Additionally, I create engineering dashboards for real-time visibility into builds, deployments, and test metrics. My goal is to enable faster feedback cycles, higher code quality, and secure, resilient delivery pipelines.