Who Is a Snowflake Developer?
A Snowflake engineer, or Snowflake developer, is a software developer who specializes in the Snowflake data platform. Snowflake is a cloud-based data warehouse that lets organizations store, query, and analyze large amounts of data quickly and efficiently, and it runs on cloud platforms such as AWS and Azure, with support for languages like Python.
A Snowflake developer typically has experience with SQL and data warehousing, as well as a strong understanding of the Snowflake platform. They are responsible for designing and implementing data structures, optimizing queries, and troubleshooting performance issues. They also integrate Snowflake with other systems, such as ETL, data visualization, and business intelligence tools.
In general, Snowflake developers are knowledgeable in data modeling, data warehousing, SQL, and cloud computing, and can design and develop data pipelines, data marts, and data lakes in Snowflake. They also develop, test, and debug Snowflake applications and maintain and improve existing Snowflake systems.
Top 200 Snowflake Developer Jobs (Remote and Onsite)
Job Role: Snowflake Developer
Location: Remote
Job Type: C2C
Technical skills requirements
The candidate must demonstrate proficiency in:
- Developing and maintaining applications using Snowflake services.
- Discussing solutions for problems that are not fully defined and flagging roadblocks early in the project lifecycle.
- Collaborating in solution review calls and technical meetings with other leads, engineers, and product owners.
- Strong development experience on the Snowflake platform.
- Designing and building internal Snowflake applications; transferring data to and from Snowflake using SnowSQL or any ETL tool.
- Developing and tuning Snowflake applications to improve performance.
- Intelligent monitoring of jobs to check job status and resource utilization.
- Scaling jobs to handle large data volumes and to meet execution-time requirements.
- Applying data quality and validation rules at every stage of data processing.
- Migrating data from on-prem systems to the Snowflake platform.
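The data-quality bullet above is usually implemented as a set of row-level rules applied before records are staged for loading. Here is a minimal Python sketch; the rule set and field names (`customer_id`, `amount`, `currency`) are hypothetical examples, not part of the job requirement:

```python
# Minimal sketch of row-level data-quality validation before a Snowflake load.
# The rules and field names below are illustrative assumptions.

def validate_row(row):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("customer_id is required")
    amount = row.get("amount")
    if amount is None or amount < 0:
        errors.append("amount must be a non-negative number")
    if row.get("currency") not in {"USD", "EUR", "GBP"}:
        errors.append("currency must be a known ISO code")
    return errors

def split_clean_and_rejected(rows):
    """Partition rows into loadable records and rejects with reasons."""
    clean, rejected = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejected.append({"row": row, "errors": errors})
        else:
            clean.append(row)
    return clean, rejected
```

In practice, the clean partition would be written to a stage file and loaded with `COPY INTO`, while rejects go to an error table for review.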
Nice-to-have skills
- Exposure to different ETL tools, such as Informatica and DataStage
- Snowflake certifications
- Experience with cloud computing solutions
- Migration of systems from on-prem to a cloud data warehouse
Qualifications
- 5-8 years of relevant work experience in the IT/ITES industry with 2+ years of Snowflake experience, or 3-4 years of pure Snowflake experience
- B.Tech., M.Tech., or MCA degree from a reputed university
Title: Snowflake Developer
Location: Anywhere in the USA (Remote)
Duration: Long-Term Contract
Job Description:
• 6+ years of experience in the IT/Technology industry
• 4+ years of experience with Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix shell, Python, etc.) to extract, load, and transform data
• Hands-on experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures
• In-depth understanding of Data Warehouse/ODS and ETL concepts and modeling principles
• Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling
• Experience gathering and analyzing system requirements
• Good working knowledge of an ETL tool (Informatica or SSIS)
• Familiarity with data visualization tools (Tableau/Power BI) is good to have
• Ability to function effectively in a cross-team environment
• Exposure to the AWS/Azure data ecosystem is good to have
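Several bullets above describe extract-load-transform scripting against Snowflake. A common building block is generating a `COPY INTO` statement that loads staged files into a target table. A minimal sketch follows; the table, stage, and file-format names are made up for illustration:

```python
# Sketch: compose a Snowflake COPY INTO statement for loading staged files.
# Table, stage, and file-format names are illustrative placeholders.

def build_copy_into(table, stage, file_format, pattern=None,
                    on_error="ABORT_STATEMENT"):
    """Build a COPY INTO statement string for loading from a named stage."""
    parts = [
        f"COPY INTO {table}",
        f"FROM @{stage}",
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')",
    ]
    if pattern:
        parts.append(f"PATTERN = '{pattern}'")
    parts.append(f"ON_ERROR = '{on_error}'")
    return "\n".join(parts) + ";"

sql = build_copy_into(
    table="analytics.orders",
    stage="raw_stage/orders",
    file_format="csv_gzip",
    pattern=".*orders_2024.*[.]csv[.]gz",
)
print(sql)
```

A script like this would typically be run through SnowSQL or the Snowflake Python connector as one step of a larger load pipeline.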
Snowflake Developer/Admin – Remote
Responsibilities:
- Snowflake DB Developer
- 6+ years of relevant experience; skilled in Snowflake DB and Oracle Database, with hands-on exposure to a cloud environment, preferably AWS
- Experience in SnowSQL, Snowpipe, Streams, views, procedures, and performance tuning
- Exposure to Time Travel and data movement between Snowflake and other RDBMSs
- Experienced in SQL performance tuning and troubleshooting
- Good knowledge of Python, including libraries such as pandas and NumPy
- Knowledge of CI/CD using Jenkins/Bamboo
- Experience in Agile/Scrum-based project execution
- Experience in orchestration using Airflow
- Hands-on Python experience is good to have
- Strong SQL expertise in joins and analytical functions
- Cloud knowledge of AWS services such as S3, Lambda, and Athena, or Azure Blob Storage
- Perl scripting knowledge is good to have
- Snowflake knowledge is good to have
- Team player with a collaborative approach and excellent communication and articulation skills
Qualifications:
- Minimum qualifications: Graduate / B.Tech. / MCA
- Knowledge of or experience in banking or capital markets is an added advantage for this position
- Snowflake DB Developer
- 6+ years of relevant experience; skilled in Snowflake DB and Oracle SQL, with good Python experience and, preferably, hands-on exposure to a cloud environment (ideally AWS)
- Experience in SnowSQL, Snowpipe, and Streams
- Exposure to Time Travel and data movement between Snowflake and other RDBMSs
- Exposure to orchestration using Airflow
- Experience in Python programming for ETL is good to have
- Experienced in SQL performance tuning, analytical functions, joins, and troubleshooting
- Knowledge of CI/CD using Jenkins/Bamboo
- Experience in Agile/Scrum-based project execution
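Time Travel, mentioned in both lists above, lets you query a table as it existed at an earlier point via `AT`/`BEFORE` clauses. A small sketch of composing such queries; the table name and offset values are illustrative only:

```python
# Sketch: compose Snowflake Time Travel queries using AT(OFFSET => ...) or
# AT(TIMESTAMP => ...). Table name and offsets are illustrative placeholders.

def time_travel_query(table, offset_seconds=None, timestamp=None):
    """Build a SELECT against a historical version of a table."""
    if offset_seconds is not None:
        # Negative offset = that many seconds before the current time.
        clause = f"AT(OFFSET => -{abs(offset_seconds)})"
    elif timestamp is not None:
        clause = f"AT(TIMESTAMP => '{timestamp}'::TIMESTAMP_LTZ)"
    else:
        raise ValueError("provide offset_seconds or timestamp")
    return f"SELECT * FROM {table} {clause};"

# Query the table as it looked one hour ago.
q = time_travel_query("analytics.orders", offset_seconds=3600)
print(q)
```

The same pattern is often used to reconcile data after a bad load, or combined with `CREATE TABLE ... CLONE` to restore a previous state.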
Role: Snowflake Developer
Location: Dallas, TX (Hybrid Model)
Duration: Long-Term Contract
What the ideal candidate looks like: AWS, Databricks, Spark, PySpark, Java
• 5+ years of experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient and distributed data pipelines.
• Experience working with ETL tools such as Databricks and Redshift
• High proficiency in at least two of Scala, Python, Spark, or Flink, applied to large-scale data sets.
• Strong understanding of workflow management platforms (Airflow or similar).
• Familiarity with advanced SQL.
• Expertise with big data technologies (Spark, Flink, data lakes, Presto, Hive, Apache Beam, NoSQL, etc.).
• Knowledge of batch and streaming data processing techniques.
• Obsession with service observability, instrumentation, monitoring, and alerting.
• Understanding of the data lifecycle management process to collect, access, use, store, transfer, and delete data.
• Strong knowledge of AWS or similar cloud platforms.
• Expertise with CI/CD tools (CircleCI, Jenkins or similar) to automate building, testing and deployment of data pipelines and to manage the infrastructure (Pulumi, Terraform or CloudFormation).
• Understanding of relational databases (e.g., MySQL, PostgreSQL), NoSQL databases (e.g., key-value stores like Redis, DynamoDB, RocksDB), and Search Engines (e.g., Elasticsearch). Ability to decide, based on the use case, when to use one over the other.
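The batch-versus-streaming bullet above comes down to whether records are processed as a complete, materialized set or one at a time as they arrive. A toy contrast in plain Python; the transformation (doubling values) is a made-up stand-in for real pipeline logic:

```python
# Toy contrast of batch vs. streaming processing styles in plain Python.
# The "transformation" (doubling each value) is a made-up stand-in.

def process_batch(records):
    """Batch: materialize the full input, then transform it in one pass."""
    data = list(records)              # whole data set held in memory
    return [value * 2 for value in data]

def process_stream(records):
    """Streaming: yield each transformed record as it arrives."""
    for value in records:             # constant memory, incremental output
        yield value * 2

batch_result = process_batch(range(5))
stream_result = list(process_stream(range(5)))
```

The results are identical here, but the streaming version emits output incrementally and never holds the full data set, which is the property frameworks like Flink or Spark Structured Streaming exploit at scale.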
Title : Sr. Snowflake Developer
Location: Dallas, TX (remote right now, but expected to be onsite later)
USC/GC ONLY
Snowflake development: Teradata-to-Snowflake migration, data warehousing, ETL (Informatica), data engineering
Job Description:
Key responsibilities
• Working knowledge of data warehouse concepts
• Strong SQL skills; expertise in writing complex SQL queries
• Procedures, functions, and macros in SQL/PL/SQL
• Experience with an ETL tool: Informatica
• Experience with RDBMSs: Oracle, SQL Server, Teradata (Teradata preferred)
• Converting existing workflows from Informatica to Matillion
• Experience with cloud-based technologies (Snowflake preferred)
• Creating new pipelines in Matillion, including Change Data Capture
• Data modeling experience
• Performance testing and tuning of Matillion pipelines
• Creating documentation of ETL processes
• Knowledge of reporting tools (BusinessObjects, Tableau) is a plus
• Should have 7+ years of overall data engineering experience
• Should have prior experience with Informatica/Matillion ETL implementations
• Should have experience creating complex SQL queries, procedures, and macros
• Responsibilities include ETL development work, i.e., redesigning existing Informatica workflows as Matillion jobs; this includes creating table structures, views, and Matillion jobs, unit testing, and migrating jobs to higher Snowflake environments
• Experience with Data Services integration to Snowflake
• Experience with Alteryx/BOBJ integration to Snowflake
• Supporting end-to-end testing and reconciliation processes
• Developing test scripts as required
• Experience with Python development
• Good communication skills; team player
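Several of the postings above ask for complex SQL with joins and analytical (window) functions. The same pattern can be demonstrated locally with Python's built-in sqlite3 module, which supports window functions, as a stand-in for Snowflake; the schema and data below are invented for illustration:

```python
# Demo of an analytical (window) function plus a join, using Python's
# stdlib sqlite3 as a stand-in for Snowflake. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (1, 50), (1, 120), (2, 75), (2, 30);
""")

# Rank each customer's orders by amount, largest first.
rows = conn.execute("""
    SELECT c.name,
           o.amount,
           ROW_NUMBER() OVER (PARTITION BY o.customer_id
                              ORDER BY o.amount DESC) AS rnk
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    ORDER BY c.name, rnk
""").fetchall()

# Keep only each customer's largest order (rank 1).
top_orders = [(name, amount) for name, amount, rnk in rows if rnk == 1]
```

The same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` construct works in Snowflake, so this kind of local prototype translates directly.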
Tasks & Deliverables:
• Development of the ETL process: Matillion jobs, data transformation, data validation, data modeling, and mapping documentation
• Deployment of the ETL process for UAT: implementation, change management, and testing of the ETL process
• Deployment of the ETL process for production: implementation, resolving issues, monitoring performance, and performance tuning of SQL queries