
Tech Interacts Inc
Client: Wells Fargo
Duration: 24+ month contract (not likely to convert to full-time)
Location: 300 S Brevard St., Charlotte, NC 28202 – Hybrid Role
W2
Interview process:
30-minute virtual interview
90-minute onsite interview
Job Description:
In this contingent resource assignment, you may:
Consult on complex initiatives with broad impact and large-scale planning for Software Engineering.
Review and analyze complex, multi-faceted, larger-scale, or longer-term Software Engineering challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors.
Contribute to the resolution of complex and multi-faceted situations requiring a solid understanding of the function, policies, procedures, and compliance requirements in order to meet deliverables.
Strategically collaborate and consult with client personnel.
Required Qualifications:
5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education.
Top Skills
5 years of experience as an engineer
5 years of SQL engineering
2-3 years PySpark
2-3 years Iceberg
2-3 years Parquet
S3
Airflow
The application is a legacy Ab Initio platform transitioning to Python and PySpark; Autosys is also being replaced by Airflow for job scheduling (a minimal sketch follows below). The team is scaling up to meet growing demand for data as the asset cap lifts. This person will work on the new project after first learning the existing platform, on a team with 7-8 onshore engineers and 6-7 offshore.
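For orientation only, here is a minimal sketch of what a nightly job might look like once it moves from Autosys to Airflow, assuming Airflow 2.x; the DAG ID, schedule, and script paths are hypothetical, not from the client:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical nightly batch job migrated from an Autosys schedule to an
# Airflow DAG; task names and script paths are illustrative.
with DAG(
    dag_id="nightly_feed_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # nightly at 02:00, replacing an Autosys cron-style condition
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract_feed", bash_command="sh /opt/jobs/extract_feed.sh")
    transform = BashOperator(task_id="transform_feed", bash_command="sh /opt/jobs/transform_feed.sh")
    load = BashOperator(task_id="load_feed", bash_command="sh /opt/jobs/load_feed.sh")
    extract >> transform >> load  # explicit task dependencies replace Autosys job conditions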
The role is heads-down engineering, but the candidate needs to interact with the existing team and coordinate with them on requirements when implementing the new system. Day to day includes daily scrum calls and working through work orders in order.
Agile is not necessary but good to know.
Banking background is good but not a must.
Certifications are a plus.
JOB DESCRIPTION
About the Role
We are seeking a highly skilled and adaptable Lead Software Engineer to join our Counterparty Credit Risk organization.
This role supports Data Services, with a focus on modernizing legacy systems, managing high-volume data pipelines, and contributing to full-stack application development.
You will be a team member working on business-as-usual (BAU) processes while also contributing to the development of a new data platform over the next two years. The ideal candidate is a strong communicator, a proactive problem-solver, and comfortable working in a Kanban Agile environment.
Key Responsibilities:
Lead Agile Development: Guide and support multiple Agile teams focused on data extraction, ingestion, and transformation.
Modernize Legacy Systems: Migrate data pipelines from Ab Initio and filesystem storage to modern technologies such as PySpark, S3, Airflow, Parquet, and Iceberg (a brief sketch follows this list).
Full-Stack Engineering: Design and develop scalable backend services using PySpark and Python.
Data Platform Enablement: Support ingestion of 300+ data feeds into the platform to ensure timely nightly batch processing.
Cross-Functional Collaboration: Partner with business stakeholders and product owners to understand requirements and deliver effective solutions.
Agile Execution: Work with both Kanban and Scrum teams, participating in regular check-ins and managing tasks via Jira.
Mentorship and Communication: Provide technical guidance, foster collaboration, and proactively seek help when encountering blockers.
Platform Transition Support: Contribute to the migration from legacy systems to a new data platform over the next two years.
BAU and Strategic Support: Balance business-as-usual responsibilities while contributing to long-term platform modernization.
Documentation and Data Modeling: Maintain clear technical documentation and demonstrate a strong understanding of columnar data structures.
Experience with different file formats (Parquet, ORC, Avro).
Experience with containerized code deployments using Docker/Kubernetes.
A Java background would be a plus.
Good knowledge of large-scale ETL frameworks.
Experience with an ETL tool (Ab Initio).
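As a point of reference for the target stack named above, here is a minimal PySpark sketch of landing a legacy flat-file feed as a Parquet-backed Iceberg table on S3, assuming Spark 3.x with the Iceberg Spark runtime on the classpath; the catalog, bucket, and table names are hypothetical:

from pyspark.sql import SparkSession

# Hypothetical catalog and bucket names; Iceberg needs a configured catalog,
# shown here as a Hadoop catalog with an S3 warehouse.
spark = (
    SparkSession.builder
    .appName("feed_migration_sketch")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Read a pipe-delimited legacy extract (stand-in for an Ab Initio output file).
feed = (
    spark.read
    .option("header", True)
    .option("delimiter", "|")
    .csv("s3a://example-bucket/landing/feed_001/")
)

# Write it as an Iceberg table; Iceberg stores data as Parquet by default.
feed.writeTo("demo.risk.feed_001").using("iceberg").createOrReplace()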
Required Skills & Experience
Top Technical Skills
3+ years of experience with PySpark, S3, Iceberg, Git, Python, Airflow, and Parquet
5+ years of experience with SQL
Experience with Agile methodologies and tools like Jira
Familiarity with Kafka
Experience with GitHub Copilot, web services, Visual Studio, IntelliJ, and Gradle
Experience with monitoring tools like Grafana or Prometheus
Preferred Qualifications
Proven experience leading Agile teams and mentoring junior developers
Strong communication skills and the ability to collaborate with business stakeholders
Comfortable working in both Scrum and Kanban models, with frequent scrum check-ins
Ability to identify blockers and proactively seek help when needed
Experience working in a regulated environment with a focus on compliance and data governance.
2+ years of working with Ab Initio graphs and plans
Team Structure & Projects
You will be part of a team that handles 300+ data feeds, ensuring timely ingestion for nightly batch processing.
The role will focus on Data Services, modernizing data ingestion pipelines.
To apply for this job email your details to sara@techinteracts.com