DevOps Engineer (Elasticsearch Experience) Job at Infinitive Inc, Falls Church, VA

  • Infinitive Inc
  • Falls Church, VA

Job Description

About Infinitive:
Infinitive is a data and AI consultancy that enables its clients to modernize, monetize, and operationalize their data to create lasting and substantial value. We possess deep industry and technology expertise to drive and sustain adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable a high return on investment.

Infinitive has been named one of Consulting Magazine's “Best Small Firms to Work For” seven times, most recently in 2024. Infinitive has also been named a Washington Post “Top Workplace” and included on the Washington Business Journal and Virginia Business “Best Places to Work” lists.


About the Role:
We are seeking a skilled DevOps Engineer to join our dynamic team. The ideal candidate will have expertise in Elasticsearch, CI/CD, Git, and Infrastructure as Code (IaC), along with hands-on data engineering experience. You will be responsible for designing, automating, and optimizing infrastructure, deployment pipelines, and data workflows. This role requires close collaboration with data engineers, software developers, and operations teams to build scalable, secure, and high-performance data platforms.
Key Responsibilities:

DevOps & Infrastructure Management:
  • Design, deploy, and manage Elasticsearch clusters, ensuring high availability, scalability, and performance for search and analytics workloads (see the sketch after this list).
  • Develop and maintain CI/CD pipelines for automating build, test, and deployment processes using tools like Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
  • Manage and optimize version control workflows using Git, ensuring best practices for branching, merging, and release management.
  • Implement Infrastructure as Code (IaC) solutions using Terraform, CloudFormation, or Ansible for cloud and on-prem infrastructure.
  • Automate system monitoring, alerting, and incident response using tools such as Prometheus, Grafana, Elastic Stack (ELK), or Datadog.
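
As an illustration of the kind of automation these responsibilities describe, here is a minimal, hedged sketch of a cluster health check and index provisioning step against the Elasticsearch REST API, in Python. The cluster URL, index name, and shard/replica settings are placeholder assumptions for illustration, not values specified for this role:

    """Hedged sketch: check Elasticsearch cluster health, then provision an
    index with explicit shard/replica settings. All endpoints and values
    are illustrative placeholders."""
    import sys
    import requests

    ES_URL = "http://localhost:9200"  # assumption: local, unsecured dev cluster

    def cluster_is_healthy() -> bool:
        # GET /_cluster/health reports green/yellow/red for the whole cluster
        resp = requests.get(f"{ES_URL}/_cluster/health", timeout=10)
        resp.raise_for_status()
        return resp.json()["status"] in ("green", "yellow")

    def create_index(name: str) -> None:
        # Explicit shards/replicas support the availability and scaling goals above
        body = {
            "settings": {"number_of_shards": 3, "number_of_replicas": 1},
            "mappings": {"properties": {"timestamp": {"type": "date"},
                                        "message": {"type": "text"}}},
        }
        resp = requests.put(f"{ES_URL}/{name}", json=body, timeout=10)
        resp.raise_for_status()

    if __name__ == "__main__":
        if not cluster_is_healthy():
            sys.exit("cluster status is red; aborting index creation")
        create_index("logs-demo")  # hypothetical index name

In practice a step like this would run from the CI/CD pipeline or IaC workflow named above, so index settings stay version-controlled rather than applied by hand.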
Data Engineering & Pipeline Automation:
  • Collaborate with data engineering teams to design and deploy scalable ETL/ELT pipelines using Apache Kafka, Apache Spark, Kinesis, Pub/Sub, Dataflow, Dataproc, or AWS Glue.
  • Optimize data storage and retrieval for large-scale analytics and search workloads using Elasticsearch, BigQuery, Snowflake, Redshift, or ClickHouse.
  • Ensure data pipeline reliability and performance, implementing monitoring, logging, and alerting for data workflows (see the sketch after this list).
  • Automate data workflows and infrastructure scaling for high-throughput real-time and batch processing environments.
  • Implement data security best practices, including access controls, encryption, and compliance with industry standards such as GDPR, HIPAA, or SOC 2.
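
To make the reliability bullet concrete, here is a minimal sketch of a retry-and-alert wrapper around a pipeline step, using only the Python standard library. The load_batch step and the alert hook are hypothetical stand-ins for a real extract/load job and a pager or chat integration:

    """Hedged sketch: run a pipeline step with retries, structured logging,
    and an alert hook. All names here are illustrative placeholders."""
    import logging
    import time

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    log = logging.getLogger("pipeline")

    def alert_on_call(message: str) -> None:
        # Placeholder: in practice this might page PagerDuty, Slack, etc.
        log.error("ALERT: %s", message)

    def run_with_retries(step, max_attempts: int = 3, base_delay: float = 2.0):
        for attempt in range(1, max_attempts + 1):
            try:
                result = step()
                log.info("step succeeded on attempt %d", attempt)
                return result
            except Exception:
                log.exception("step failed (attempt %d/%d)", attempt, max_attempts)
                if attempt == max_attempts:
                    alert_on_call("pipeline step exhausted retries")
                    raise
                time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

    def load_batch():
        # Hypothetical extract/load step; a real one might read from Kafka
        # or Kinesis and write to a warehouse such as Snowflake or BigQuery.
        return {"rows_loaded": 1000}

    if __name__ == "__main__":
        run_with_retries(load_batch)

The same wrapper pattern applies whether the step is a Spark job submission, a Glue trigger, or a bulk write into Elasticsearch.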
Required Skills & Qualifications:
  • 3+ years of experience in DevOps, Data Engineering, or Infrastructure Engineering.
  • Strong expertise in Elasticsearch, including cluster tuning, indexing strategies, and scaling.
  • Hands-on experience with CI/CD pipelines using Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
  • Proficiency in Git for version control, branching strategies, and code collaboration.
  • Experience with Infrastructure as Code (IaC) using Terraform, CloudFormation, Ansible, or Pulumi.
  • Solid experience with cloud platforms (AWS, GCP, or Azure) and cloud-native data engineering tools.
  • Proficiency in Python, Bash, or Scala for automation, data processing, and infrastructure scripting.
  • Hands-on experience with containerization and orchestration (Docker, Kubernetes, Helm).
  • Experience with data engineering tools, including Apache Kafka, Spark Streaming, Kinesis, Pub/Sub, or Dataflow.
  • Strong understanding of ETL/ELT workflows and distributed data processing frameworks.
Preferred Qualifications:
  • Experience working with data warehouses and lakes (BigQuery, Snowflake, Redshift, ClickHouse, S3, GCS).
  • Knowledge of monitoring and logging solutions for data-intensive applications.
  • Familiarity with security best practices for data storage, transmission, and processing.
  • Understanding of event-driven architectures and real-time data processing frameworks.
  • Certifications such as AWS Certified DevOps Engineer, Google Cloud Professional Data Engineer, or Certified Kubernetes Administrator (CKA).

