This is a Full Remote job; the offer is available from: United States, Maryland (USA)
Job Details
Level: Experienced
Job Location: Remote – Baltimore, MD
Education Level: 4 Year Degree
Salary Range: Undisclosed
Description
Company Overview
Index Analytics, LLC, is a rapidly growing Baltimore-based small business providing health-related consulting services to the federal government. At the center of our company culture is a commitment to fostering a dynamic and employee-friendly place to work. We place a priority on promoting a supportive and collegial team environment and enhancing staff’s experience through career development and educational opportunities.
Job Overview
The Data Engineer will assist in supporting and moving the existing ETL solution into a more modernized platform while providing direct guidance to other members of the team. This includes acting as a go-to resource when technical challenges arise and providing leadership and coordination as part of a project team of data engineers, Cloud Architects, analysts, and testers to implement a Snowflake environment to ingest a wide variety of data sources to allow for data analytics.
The Data Engineer determines structural, interface, and business requirements for developing and installing solutions. This includes the design of relational databases, other types of databases, and associated interfaces, used for data storage and processing. The Data Engineer develops warehouse and data mart implementation strategies, data acquisition, and archive recovery. In pursuit of system and service optimization, the Data Engineer may perform other duties, such as investigating new data products and technology, evaluating new data sources, and reviewing existing products and products under development for adherence to the organization’s quality standards and ease of integration.
Responsibilities
• Design and onboard new ETL streams using AWS Glue and PySpark
• Migrate existing ETL streams from Databricks and Linux/Python scripts to AWS Glue
• Re-architect and incrementally improve the performance of ETL solutions without impacting the delivery of new features requested by the client
• Assist with development efforts to modernize data ingestion patterns
• Maintain existing ETL pipelines and debug pipeline failures
• Complete development tasks such as building custom reports, developing complex queries, performing ETL and data warehousing, and disseminating data to stakeholders
• Perform cloud engineering, testing, DevOps, application support, data migrations, data loads, and job scheduling
• Work with the DevSecOps team to set up any supporting infrastructure
• Remediate security vulnerabilities in code identified through static code analysis or environment scanning.
• Model data to support ingestion of a wide variety of CMS data sources and requirements from data analysts and other stakeholders
• Use Python and/or Linux shell to perform file management and other scripting tasks
• Optimize existing processes to improve performance
• Work closely with product owners and DevOps to ensure compliance with SDLC processes
• Collaborate with business analysts to gather requirements, develop, and document business rules, create test scenarios to ensure properly working code, and communicate technical concepts for transparency
Qualifications
• US citizen, or authorized to work in the US and resident in the US for 3 of the last 5 years
• 5 years of relevant experience
• Bachelor’s degree or equivalent, or 4 years’ relevant experience in lieu of a degree
• Strong experience with Cloud platforms such as AWS
• Experience with AWS Glue a plus
• Very strong experience with Python or PySpark (2+ years)
• Experience writing Python or PySpark code for ETL processes
• Experience with AWS Lambda a plus
• Experience working with semi-structured (XML, JSON, PARQUET) data
• Strong experience with SQL
• Deployment automation experience via CI/CD tools such as AWS CodePipeline is a plus
• Experience with version control, preferably with GitHub
• Experience working within an Agile development environment on development and testing activities
• Working knowledge of database security, audit, and RBAC controls
• Knowledge of Snowflake cloud database platform
• Hands-on experience with database performance tuning, clustering key analysis, sizing, and cost optimization
• SnowPro certification or AWS certifications are a plus
• Experience analyzing data and presenting information to stakeholders.
• Ability to obtain Public Trust level clearance.
Index Analytics provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Attention Candidates:
We’re dedicated to ensuring a safe and transparent recruitment process for all candidates and have implemented robust measures to protect your personal information. Please be aware that all employment-related communications will originate from a secure portal ([email protected]) or a corporate email address ([email protected]). If you have any concerns, please don’t hesitate to reach out to us at [email protected].