Duration: Long Term
Location: Fort Mill, SC (Onsite)
Note: Only candidates with banking or finance experience will be considered.
Key Responsibilities
- Design & execute test plans for ETL/ELT workflows, data feeds, and batch extracts, ensuring comprehensive data validation
- Write and maintain manual and automated tests for data accuracy, completeness, and consistency against mapping documents and specification rules
- Develop SQL-based validation logic—supporting complex queries, joins, aggregates, and stored procedures—to verify data correctness
- Collaborate with data engineering, BI, and architecture teams to identify and resolve data quality issues across data pipelines
- Develop or support CI/CD pipelines for automated data quality checks and data-flow validations
- Analyze test results, document defects, track resolution, and report status to stakeholders; act as an advocate for quality in an Agile delivery process
Required Skills & Experience
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 5–6+ years of experience in QA, SDET, or data quality roles, preferably in data engineering/test automation in the financial services industry
- Strong programming and automation experience in Python or Java, with focus on testing frameworks and automation tools (e.g. Selenium, TestNG, Appium)
- Advanced SQL skills, capable of building complex queries to validate ETL, data loads, and data transformations
- Hands‑on with ETL/Database testing—manual & automated—covering batch pipelines, API integrations, and data feeds
- Experience with AWS-based data environments (e.g. S3, RDS, DynamoDB, Lambda, Snowflake, Redshift) and familiarity with data lakes
- Knowledge of CI/CD tools (Jenkins, TeamCity, GitHub Actions, Octopus), test automation in pipelines, and Agile methodologies including Scrum ceremonies
Preferred Qualifications
- Familiarity with data quality tools, metrics frameworks, rule validation engines, or profiling platforms
- Exposure to big data technologies (e.g. Hadoop, Snowflake, dbt, Kafka, Airflow, Fivetran, PySpark) in data testing contexts
- Understanding of dimensional modeling (star schema, SCD), data governance standards, metadata management, and change data capture patterns
- Financial services domain experience and familiarity with regulatory/operational compliance testing contexts
Best,
Kavya Reddy
Email: kavya@dvgts.com
Direct : (609) 596-2421
Desk: (609) 888-6198 Ext: 139
DVG Tech Solutions LLC.