Hope you are doing well!
This is Amritanshu from Himflax Information Technologies Inc. We have an urgent requirement. Please review the job descriptions below, and if you are interested, let me know. Please share your updated resume with me at amritanshu@himflax.com, or reach me directly at +1 551-201-9151.
Business Continuity Platform Administrator | Lehi, UT / San Jose, CA (Hybrid)
UX-Program Manager | Lehi, UT / San Jose, CA (Hybrid)
Technical Data Analyst | San Jose, CA (Hybrid)
ServiceNow HR SD Product Analyst | San Jose, CA (Hybrid)
Cloud Architect | San Jose, CA (Hybrid)
Data Architect | San Jose, CA (Hybrid)
Visa Note: No OPT / TN visa / GC (only 14+)
(1) Job description
Job ID: 2117
Job Title: Business Continuity Platform Administrator
Location: Lehi, UT (Hybrid) / San Jose, CA (Hybrid)
Job Description:
Key Skills
- Business Continuity
- Disaster Recovery
- ServiceNow Admin experience is a must-have
- Understand how ServiceNow works as a platform
- Understand end-to-end BCM processes
- Ability to understand the basic principles of Resilience
- Experience working in a SaaS-based environment
- Lehi, UT / San Jose, CA
- 3–5 years of experience is ideal
Join the Client's Enterprise Resilience team as a Business Continuity Platform Administrator, where you'll help ensure operational resilience by managing and optimizing our Business Continuity Management platform and supporting critical continuity initiatives. This role offers the chance to work closely with our business stakeholders on projects that improve the resiliency of critical business functions. Working with the Enterprise Resilience team, you will play a pivotal role in maintaining our business continuity management program and software platform.
What You'll Do
- Manage and audit user roles, permissions, and access requests to ensure compliance and platform security.
- Act as the primary point of contact for process owners and users, providing prompt assistance for issues.
- Triage, troubleshoot, and escalate technical problems to appropriate teams when necessary.
- Monitor and respond to inquiries through the designated communication channel, ensuring timely support.
- Deliver clear guidance and support to users across business units, fostering a positive experience.
- Develop and maintain training resources, including user guides, FAQs, and video tutorials.
- Conduct onboarding and refresher sessions to drive platform adoption and literacy.
- Analyze user engagement metrics to identify trends and recommend improvements for adoption.
- Identify and implement workflow enhancements to improve user experience and efficiency.
- Maintain accurate documentation of platform processes, configurations, and user guides.
- Support reporting needs related to platform usage, issues, and enhancements.
- Assist with Business Continuity projects as required.
What You Need to Succeed
- 3–5 years of experience in business continuity, risk management, or platform administration.
- Hands-on experience with SaaS platforms and providing user support in an enterprise environment.
- Knowledge of ISO 22301 or related standards, and/or a solid understanding of the Business Continuity Management lifecycle and IT concepts.
- Proficiency in key tools, including Microsoft Office (Word, Excel, PowerPoint, Access, SharePoint), ServiceNow, Jira, and other collaboration platforms.
- Strong communication skills, with the ability to deliver clear, concise, and compelling analysis, documentation, and presentations.
- Proven ability to collaborate effectively with cross-functional teams across the organization.
- Excellent organizational and administrative skills, including experience developing project plans and meeting deadlines.
- Self-starter and team player, capable of working independently with minimal supervision.
- Preferred qualifications: Undergraduate degree or equivalent experience, and professional certifications such as CBCP or MBCI.
(2) Job description
Job Title: UX-Program Manager
Location: Lehi, UT (Hybrid) / San Jose, CA (Hybrid)
HM NOTES:
- 40% – UX
- 35% – Change Management
- 25% – Program Management
- Please look for a UX Designer / Product Designer with experience in Change Management and Program Management in the above proportions
The HM is open to remote candidates if the candidate is exceptionally strong.
Job Description:
Senior UX Program Manager (Data Governance)
What You’ll Do
- Lead product discovery and UX for L&C (portal and APIs) end-to-end: user research, journey mapping, information architecture, interaction design, prototyping in Figma, and design QA for features like RBAC, classifier/label management, and governance insights.
- Own program delivery for cross-functional increments—define scope, RACI/DACI, roadmaps, and SLC gates; run sprint and release rituals; remove blockers across Security, POCL, and product engineering.
- Drive change management: craft enablement assets (how-to guides, API recipes, short videos), partner roadshows, and rollout plans that turn “available” into “adopted,” especially for Security data owners and BU data stewards.
- Instrument adoption and value: define and publish a lean KPI set (e.g., RBAC coverage, time-to-label approval, % of high-risk buckets classified, audit controls closed) and run quarterly “Leadership Outcomes” updates.
- Partner with compliance leaders to ensure user experiences align with Adobe’s Data Classification & Handling Standard and privacy obligations (e.g., GDPR/CCPA), and to simplify audit evidence creation.
- Raise the design bar: facilitate critiques, maintain a component library and content style guidance, and ensure accessibility, responsiveness, and internationalization from day one.
What You Need to Succeed
- 8+ years in UX/Product Design (or UX lead roles) with a portfolio built in Figma that showcases complex, data-heavy, workflow products.
- 3–5+ years delivering program management outcomes in technical domains (platform, data, or security), including multi-team roadmaps and measurable impact.
- Proven change manager: experience designing rollout strategies, communications, and training that drive behavior change across large, distributed orgs.
- Strength in user research (interviews, co-design, usability testing), IA, and content design for expert users (e.g., Security, Privacy, platform engineers).
- Fluency collaborating with Security (CSO) and POCL-type stakeholders or equivalents in highly regulated contexts.
- Working knowledge of RBAC concepts, data classification/labeling, and governance workflows; comfort translating policy into intuitive product behaviors.
- Tools: Figma, FigJam, Jira/Confluence or Workfront, M365; bonus for analytics/telemetry tools and basic familiarity with cloud data ecosystems (e.g., buckets, catalogs).
Nice to Have
- Background in data governance, privacy, or security products.
- Experience shipping enterprise admin and self-service experiences (portals + APIs).
- Familiarity with Adobe’s privacy and governance stakeholder landscape (Security/CSO, POCL) and steering structures.
(3) Job description
Job Title: Technical Data Analyst
Location: San Jose, CA (Hybrid, 2–3 days in office)
Data Analyst / Data Product Manager
Migration is involved
- Technically strong in data and field mapping
- Hands-on data analysis
- SQL, Python, Power BI
- Building integrations between SFDC, Dynamics, SAP ECC and DBX for Power BI reporting and financial metrics.
- Migration from Salesforce.com to Dynamics 365 – MUST
Job Description:
The Opportunity
I am looking for a Technical Data Analyst who has hands-on experience migrating Sales users and Sales data (Accounts, Opportunities, Contacts) from Salesforce.com to Dynamics 365. The candidate should also have strong experience in data profiling, data integration, and transformations, and understand how data impacts downstream systems such as Sales and Finance data warehouses/data lakes, reporting, etc.
What You’ll Do
- Work with Adobe internal data teams and business teams to decipher the data-related requirements for the project
- Hands-on experience understanding SFDC-to-Dynamics field mappings and data models
- Design an end-to-end strategy to stitch together objects from various systems and financial KPIs
- Building integrations between SFDC, Dynamics, SAP ECC and DBX for Power BI reporting and financial metrics.
- Leverage data sources across the enterprise to build sophisticated and insightful analyses and data models for Sales, Finance and Marketing
- Hands-on experience migrating Marketing and Sales data and users from Salesforce.com to Dynamics
- Work with the Product Managers to build detailed data requirements/specifications for Engineering teams to build the solution in downstream data management and reporting systems.
- Understand migration challenges from similar experiences and build creative solutions to help migrate data from SFDC to Dynamics.
- Consolidate requirements and suggest building new reporting capabilities for analysis using advanced BI techniques and tools.
- Proactively collaborate with various product managers to bring a perspective on all data we work on.
- Conduct QA testing and validation, and provide input to the Engineering teams along with the PdMs
- Support release planning, scheduling backlog items into regular releases aligned to business priority while working with the PdMs
- Support production cutover and production acceptance testing
- Support post-go-live sessions with the business, addressing and driving technical issues raised during Hyper Care, together with the PdMs and Engineering teams
Qualifications
- Requires a bachelor's degree. Preferred candidates will have a major in computer science, an MBA from a reputable institution, or equivalent experience.
- 4+ years of data analytics, ‘data BSA’ or data product management experience with solid understanding of how to deliver data solutions in an agile environment.
- Strong proficiency in SQL/SparkSQL/Python to query and manipulate large data sets. Experience with platforms like Databricks, Power BI and Tableau.
- You are a self-starter, independent, hard worker, with a high degree of motivation to go above and beyond the task at hand. You anticipate and creatively implement next steps in a complex environment.
- You have mastered the ability to influence outcomes, navigate, mediate to consensus with integrity. You possess great interpersonal communication, presentation skills, and social skills and a solid sense of humor.
- Data requirement writing skills: collecting, prioritizing, and gathering input from multiple sources, providing accurate requirements with attention to detail.
- You already know, or can rapidly learn, enterprise application capabilities in order to deliver transaction- and event-driven data solutions (examples: SAP/HANA, MS Dynamics or Salesforce data, ADLS/Hadoop/Databricks data lake/lakehouse solutions, and/or Kafka streams).
(4) Job description
Job Title: HRSD Product Analyst
Location: San Jose, CA (Hybrid)
ServiceNow HR Service Delivery
Job Summary
As an HRSD Product Analyst, you will play a critical role in supporting the HRSD Product Manager during run-the-business (RTB) activities as well as project work. You will gain broad experience across various ServiceNow applications, working primarily within the HRSD (Employee Workflows) scope: gathering, documenting, testing, and releasing core functionality for business teams across our Employee Experience (EX) function at Adobe, such as Payroll, Equity, Benefits, HR, and Global Mobility.
What You’ll Do
- Co-lead discovery with HR COEs (Payroll, Benefits, Leave, ER, Mobility) to map current vs target journeys, intake, and SLAs; rationalize services to a single COE each.
- Co-author PRDs, epics, and user stories with acceptance criteria
- Design per-COE service components: catalog items/record producers, case/task templates, SLAs, notifications, Quick Messages, and workspace layouts.
- Build routing models: AWA queues, skills, schedules, capacity; remove double-routing; define exception/overflow logic.
- Write functional specs for HRSD and HR Agent Workspace (UI Builder pages, record headers, side panels, related lists)
- Partner on integrations (Workday/Payroll/Equity/Identity): field mappings, source-of-truth, error handling, observability
- Author ATF test cases for critical paths; define Instance Scan rules; support UAT by persona and accessibility checks.
- Support reporting and dashboard creation and maintenance
- Support UAT and hypercare
- Operate in CI/CD: update set/source control hygiene
- Maintain RAID/logs
- Business stakeholder management and communications for RTB and project requirements
- Rationalizing business requirements and translating them into technical requirements
Certifications:
- ServiceNow System Admin certification (must)
- HRSD Implementation certification (preferred)
- Hands-on experience in HRSD configurations (must)
Functional:
- 4+ years of Product/Business Analysis experience, with 2+ years in ServiceNow HRSD
- Strong HRSD architecture knowledge and hands-on experience
- Ability to write precise requirements, epics, stories, and acceptance criteria
- Experience using Jira
(5) Job description
Title: Cloud Architect and Data Architect (Two roles)
Location: San Jose, CA (Hybrid: 3 days onsite, 2 days remote)
Note: Bachelor's degree (B.Tech in Computer Science) is a must.
Job Description:
- The Cloud Architect will be a key contributor to designing, evolving, and optimizing our company's cloud-based data architecture. This role requires a strong background in data engineering, hands-on experience building cloud data solutions, and a talent for communicating complex designs through clear diagrams and documentation. Must work EST hours.
- Strategy, Planning, and Roadmap Development: Align AI and ML system design with broader business objectives, shaping technology roadmaps and architectural standards for end-to-end cloud-driven analytics and AI adoption.
- Designing End-to-End AI/ML Workflows: Architect and oversee all stages of AI/ML pipeline development (data ingestion, preprocessing, model training, validation, deployment, monitoring, and lifecycle management) within cloud environments.
- Selecting Technologies and Services: Evaluate and choose optimal cloud services, AI/ML platforms, infrastructure components (compute, storage, orchestration), frameworks, and tools that fit operational, financial, and security requirements.
- Infrastructure Scalability and Optimization: Design and scale distributed cloud solutions capable of supporting real-time and batch processing workloads for AI/ML, leveraging technologies like Kubernetes, managed ML platforms, and hybrid/multi-cloud strategies for optimal performance.
- MLOps, Automation, and CI/CD Integration: Implement automated build, test, and deployment pipelines for machine learning models, facilitating continuous delivery, rapid prototyping, and agile transformation for data- and AI-driven products.
- Security, Compliance, and Governance: Establish robust protocols for data access, privacy, encryption, and regulatory compliance (e.g., GDPR, ethical AI), coordinating with security experts to continuously assess risks and enforce governance.
- Business and Technical Collaboration: Serve as the liaison between business stakeholders, development teams, and data scientists, translating company needs into technical solutions and driving alignment and innovation across departments.
- Performance Evaluation & System Monitoring: Monitor infrastructure and AI workloads, optimize resource allocation, troubleshoot bottlenecks, and fine-tune models and platforms for reliability and cost-efficiency at scale.
- Documentation and Best Practices: Create and maintain architectural diagrams, policy documentation, and knowledge bases for AI/ML and cloud infrastructure, fostering a culture of transparency, learning, and continuous improvement.
- Continuous Innovation: Stay abreast of new technologies, frameworks, and trends in AI, ML, and cloud computing; evaluate emerging approaches; and lead strategic pilots or proofs-of-concept for next-generation solutions.
This role blends leadership in technology and systems architecture with hands-on expertise in cloud infrastructure, artificial intelligence, and machine learning, pivotal for driving innovation, scalability, and resilience in a modern enterprise.
Required Qualifications
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
- Minimum of 5 years of hands-on data engineering experience using distributed computing approaches (Spark, MapReduce, Databricks).
- Proven track record of successfully designing and implementing cloud-based data solutions in Azure
- Deep understanding of data modeling concepts and techniques.
- Strong proficiency with database systems (relational and non-relational).
- Exceptional diagramming skills with tools like Visio, Lucidchart, or other data visualization software.
Preferred Qualifications
- Advanced knowledge of cloud-specific data services (e.g., Databricks, Azure Data Lake).
- Expertise in big data technologies (e.g., Hadoop, Spark).
- Strong understanding of data security and governance principles.
- Experience in scripting languages (Python, SQL).
Additional Skills
- Communication: Exemplary written and verbal communication skills to collaborate effectively with all teams and stakeholders.
- Problem-solving: Outstanding analytical and problem-solving skills for complex data challenges.
- Teamwork & Leadership: Ability to work effectively in cross-functional teams and demonstrate potential for technical leadership.
Thank you!
Best Regards,
Amritanshu Ratna
Technical Recruiter
Himflax Information Tech INC.
USA: 3240 East State Street Ext, Ste #3, Hamilton, NJ 08619
Email: amritanshu@himflax.com URL: www.himflax.com
LinkedIn: linkedin.com/in/amritanshu-ratna-802000212
Contact No.: +1 551-201-9151, Ext. 129