7+ Workflow Jobs in Mumbai | Workflow Job openings in Mumbai
Position: QA Engineer – Machine Learning Systems (5 - 7 years)
Location: Remote (Company in Mumbai)
Company: Big Rattle Technologies Private Limited
Immediate Joiners only.
Summary:
The QA Engineer will own quality assurance across the ML lifecycle—from raw data validation through feature engineering checks, model training/evaluation verification, batch prediction/optimization validation, and end-to-end (E2E) workflow testing. The role is hands-on with Python automation, data profiling, and pipeline test harnesses in Azure ML and Azure DevOps. Success means provably correct data, models, and outputs at production scale and cadence.
Key Responsibilities:
Test Strategy & Governance
- Define an ML-specific Test Strategy covering data quality KPIs, feature consistency checks, model acceptance gates (metrics + guardrails), and E2E run acceptance (timeliness, completeness, integrity).
- Establish versioned test datasets and golden baselines for repeatable regression of features, models, and optimizers.
Data Quality & Transformation
- Validate raw data extracts and data landed in the data lake: schema/contract checks, null/outlier thresholds, time-window completeness, duplicate detection, site/material coverage.
- Validate transformed/feature datasets: deterministic feature generation, leakage detection, drift vs. historical distributions, feature parity across runs (hash or statistical similarity tests).
- Implement automated data quality checks (e.g., Great Expectations/pytest + Pandas/SQL) executed in CI and AML pipelines.
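The automated data quality checks described above can be sketched in plain pandas/pytest-style Python. The column names, business key, and thresholds below are hypothetical placeholders for illustration, not the actual pipeline's data contract:

```python
import pandas as pd

# Hypothetical contract for illustration only.
NULL_THRESHOLD = 0.02  # max fraction of nulls tolerated per column
REQUIRED_COLUMNS = {"site_id", "material_id", "price", "event_ts"}

def check_schema(df: pd.DataFrame) -> list[str]:
    """Return contract violations (an empty list means the check passed)."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    return [f"missing column: {c}" for c in sorted(missing)]

def check_nulls(df: pd.DataFrame) -> list[str]:
    """Flag columns whose null fraction exceeds the threshold."""
    violations = []
    for col in df.columns:
        frac = df[col].isna().mean()
        if frac > NULL_THRESHOLD:
            violations.append(f"{col}: {frac:.1%} nulls > {NULL_THRESHOLD:.0%}")
    return violations

def check_duplicates(df: pd.DataFrame,
                     keys=("site_id", "material_id", "event_ts")) -> list[str]:
    """Detect duplicate rows on the (assumed) business key."""
    dupes = int(df.duplicated(subset=list(keys)).sum())
    return [f"{dupes} duplicate rows on {keys}"] if dupes else []
```

In a pytest suite each check would be its own test asserting an empty violation list, so CI fails fast on the first contract breach.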
Model Training & Evaluation
- Verify training inputs (splits, windowing, target leakage prevention) and hyperparameter configs per site/cluster.
- Automate metric verification (e.g., MAPE/MAE/RMSE, uplift vs. last model, stability tests) with acceptance thresholds and champion/challenger logic.
- Validate feature importance stability and sensitivity/elasticity sanity checks (price/volume monotonicity where applicable).
- Gate model registration/promotion in AML based on signed test artifacts and reproducible metrics.
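As a rough sketch of the metric verification and champion/challenger gating above, the functions below compute two of the named metrics and apply an acceptance gate; the thresholds are illustrative assumptions, not the real promotion criteria:

```python
import numpy as np

def mape(y_true, y_pred) -> float:
    """Mean absolute percentage error (assumes no zero actuals)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)))

def rmse(y_true, y_pred) -> float:
    """Root mean squared error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def promote_challenger(champion_mape: float, challenger_mape: float,
                       max_mape: float = 0.15, min_uplift: float = 0.0) -> bool:
    """Champion/challenger gate: the challenger must clear an absolute
    threshold AND improve on the champion by at least min_uplift."""
    return (challenger_mape <= max_mape
            and (champion_mape - challenger_mape) >= min_uplift)
```

The gate's output (plus the metrics themselves) would be what gets written into the signed test artifact that model registration checks.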
Predictions, Optimization & Guardrails
- Validate batch predictions: result shapes, coverage, latency, and failure handling.
- Test model optimization outputs and enforced guardrails: detect violations and prove idempotent writes to DB.
- Verify API push to the third-party system (idempotency keys, retry/backoff, delivery receipts).
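The idempotency-plus-retry pattern above can be sketched as follows; the key derivation and the `send` callable are hypothetical stand-ins for the real API client, and the pattern only works if the receiving system treats a repeated key as a no-op:

```python
import hashlib
import time

def idempotency_key(site_id: str, material_id: str, run_id: str) -> str:
    """Deterministic key so a retried push cannot create a duplicate record."""
    return hashlib.sha256(f"{site_id}|{material_id}|{run_id}".encode()).hexdigest()

def push_with_retry(send, payload, key, max_attempts=4, base_delay=0.5):
    """Retry with exponential backoff; `send` is the (hypothetical) API call."""
    for attempt in range(max_attempts):
        try:
            return send(payload, headers={"Idempotency-Key": key})
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # exhausted: surface the failure to the caller
            time.sleep(base_delay * 2 ** attempt)  # 0.5s, 1s, 2s, ...
```

A QA test would then assert both that transient failures are retried and that replaying the same run produces the same key.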
Pipelines & E2E
- Build pipeline test harnesses for AML pipelines (nightly data generation, weekly training, prediction/optimization), including orchestrated synthetic runs and fault injection (missing slice, late competitor data, Service Bus backlog).
- Run E2E tests from raw data store -> ADLS -> AML -> RDBMS -> APIM/Frontend; assert freshness SLOs and audit event completeness (Event Hubs -> ADLS immutable).
Automation & Tooling
- Develop Python-based automated tests (pytest) for data checks, model metrics, and API contracts; integrate with Azure DevOps (pipelines, badges, gates).
- Implement data-driven test runners (parameterized by site/material/model-version) and store signed test artifacts alongside models in AML Registry.
- Create synthetic test data generators and golden fixtures to cover edge cases (price gaps, competitor shocks, cold starts).
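A synthetic generator for the edge cases named above might look like this; every parameter (series length, gap window, shock day and magnitude) is an illustrative assumption, and the seed makes the fixture reproducible as a golden baseline:

```python
import numpy as np
import pandas as pd

def make_price_series(n_days=60, base=100.0, gap_days=(20, 23),
                      shock_day=40, seed=0) -> pd.DataFrame:
    """Synthetic daily price series with a missing-data gap and a
    competitor shock, for edge-case fixtures. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    prices = base + rng.normal(0, 2, n_days).cumsum() * 0.1
    df = pd.DataFrame({"day": np.arange(n_days), "price": prices})
    df.loc[df["day"].between(*gap_days), "price"] = np.nan  # simulated price gap
    df.loc[df["day"] >= shock_day, "price"] *= 0.9          # 10% competitor shock
    return df
```

Checked into the repo as a fixture, the same seed and parameters regenerate an identical frame on every run, which is what makes regression comparisons against golden baselines meaningful.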
Reporting & Quality Ops
- Publish weekly test reports and go/no-go recommendations for promotions; maintain a defect taxonomy (data vs. model vs. serving vs. optimization).
- Contribute to SLI/SLO dashboards (prediction timeliness, queue/DLQ, push success, data drift) used for release gates.
Required Skills (hands-on experience in the following):
- Python automation (pytest, pandas, NumPy), SQL (PostgreSQL/Snowflake), and CI/CD (Azure DevOps) for fully automated ML QA.
- Strong grasp of ML validation: leakage checks, proper splits, metric selection (MAE/MAPE/RMSE), drift detection, sensitivity/elasticity sanity checks.
- Experience testing AML pipelines (pipelines/jobs/components) and message-driven integrations (Service Bus/Event Hubs).
- API test skills (FastAPI/OpenAPI, contract tests, Postman/pytest-httpx), plus idempotency and retry patterns.
- Familiarity with feature stores/feature engineering concepts and reproducibility.
- Solid understanding of observability (App Insights/Log Analytics) and auditability requirements.
Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
- 5–7+ years in QA with 3+ years focused on ML/Data systems (data pipelines + model validation).
- Certification in Azure Data or ML Engineer Associate is a plus.
Why should you join Big Rattle?
Big Rattle Technologies specializes in AI/ML products and solutions as well as mobile and web application development. Our clients include Fortune 500 companies. Over the past 13 years, we have delivered multiple projects for international and Indian clients across industries such as FMCG, banking and finance, automobiles, e-commerce, etc. We also specialize in product development for our clients.
Big Rattle Technologies Private Limited is ISO 27001:2022 certified and CyberGRX certified.
What We Offer:
- Opportunity to work on diverse projects for Fortune 500 clients.
- Competitive salary and performance-based growth.
- Dynamic, collaborative, and growth-oriented work environment.
- Direct impact on product quality and client satisfaction.
- 5-day hybrid work week.
- Certification reimbursement.
- Healthcare coverage.
How to Apply:
Interested candidates are invited to submit a resume detailing their experience and the kinds of projects they have worked on, highlighting their contributions and accomplishments on those projects.
Required Qualifications
- Bachelor’s degree with a Commerce background / MBA in Finance (mandatory).
- 3+ years of hands-on implementation/project management experience
- Proven experience delivering projects in Fintech, SaaS, or ERP environments
- Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
- Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
- Experience working with ETL pipelines or data migration processes
- Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
- Strong communication and stakeholder management skills
- Ability to manage multiple projects simultaneously and drive client success
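The "complex queries" requirement above (joins, CTEs, subqueries) can be illustrated with a small self-contained example using Python's built-in sqlite3; the schema and data are hypothetical, invented purely to show a CTE feeding a subquery:

```python
import sqlite3

# Illustrative schema and data; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (id INTEGER, customer TEXT, amount REAL);
INSERT INTO invoices VALUES (1,'acme',100),(2,'acme',250),(3,'globex',75);
""")

query = """
WITH totals AS (                 -- CTE: aggregate per customer
    SELECT customer, SUM(amount) AS total
    FROM invoices
    GROUP BY customer
)
SELECT customer, total
FROM totals
WHERE total > (SELECT AVG(total) FROM totals)  -- subquery against the CTE
ORDER BY total DESC;
"""
rows = conn.execute(query).fetchall()
# acme totals 350 and globex 75; the average is 212.5, so only acme qualifies
```

The same shape (CTE for an intermediate aggregate, subquery for a comparison baseline) carries over to PostgreSQL or a reconciliation warehouse unchanged.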
Preferred Qualifications
- Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
- Familiarity with API integrations and basic data mapping
- Experience in agile/scrum-based implementation environments
- Exposure to reconciliation, book closure, AR/AP, and reporting systems
- PMP, CSM, or similar certifications
Job Title : GoldenSource Technical Consultant
Experience : 7+ Years
Location : Pan India (Hybrid)
Notice Period : Immediate joiners, or currently serving a notice period of at most 10 to 15 days
Interview Mode : Virtual
Job Description :
We are looking for a skilled GoldenSource Technical Consultant / Developer with extensive experience in implementing and customizing GoldenSource v8.x for financial institutions. The ideal candidate should possess strong technical knowledge of security reference data management and hands-on experience working with market data vendors.
Mandatory Skills :
GoldenSource v8.x, Security Master, Workflow & UI Development, Bloomberg/Reuters Integration, Oracle/PostgreSQL, JBoss/JMS.
Key Responsibilities :
- Lead and implement GoldenSource v8.x in financial services environments.
- Deep understanding of GoldenSource Security Master v8.x, including architecture and data model.
- Design and develop workflows, connectors, and UI components within GoldenSource.
- Manage and handle Security Reference Data effectively.
- Integrate with data vendors like Bloomberg, Reuters, etc.
- Understand and apply knowledge of financial instruments and capital markets.
- Work with RDBMS platforms such as Oracle or PostgreSQL.
- Collaborate using tools such as JIRA, Confluence, and middleware technologies like JBoss and JMS.
Key Responsibilities:
Design, develop, and optimize scalable data pipelines and ETL processes.
Work with large datasets using GCP services like BigQuery, Dataflow, and Cloud Storage.
Implement real-time data streaming and processing solutions using Pub/Sub and Dataproc.
Collaborate with cross-functional teams to ensure data quality and governance.
Technical Requirements:
4+ years of experience in Data Engineering.
Strong expertise in GCP services like Workflows, Dataproc, and Cloud Storage, as well as TensorFlow.
Proficiency in SQL and programming languages such as Python or Java.
Experience in designing and implementing data pipelines and working with real-time data processing.
Familiarity with CI/CD pipelines and cloud security best practices.
- Strong expertise in Salesforce.com configuration modules like Roles, Profiles, Security & User Setup, Custom Objects, Tabs, Workflows, Approval Processes, Reports & Dashboards.
- Strong on technical proficiency in Salesforce.com tools and features like SOQL, Apex API, Visual Force, Triggers, Web Services, Batch Apex.
- Ability to leverage experience with web services and the cloud to build integration points between Salesforce and External systems.
- Experience in migration of objects/metadata from one (development/sandbox) organization to another (Sandbox or production) using Change-Sets, Eclipse IDE.
- Good problem-solving skills and ability to give optimal solutions.
- Lightning component development, Lightning design development.
- Salesforce platform experience (Sales Cloud, Service Cloud, General Configuration, etc…)
- Salesforce.com development experience (APEX, Visualforce, Portals / Communities)
- Deep understanding with technical capabilities of Visual Force, APEX APIs, APEX Triggers, and APEX Web services.
- Excellent written, verbal, presentation, and organizational skills; ability to interface with all levels and business units.
- Must work independently in a complex, fast-paced environment to ensure quality and timeliness of system information.
- Salesforce.com Certifications (Developer and Architect) are an additional advantage.
- 2+ years of experience
- Excellent technical skills on Salesforce or similar applications, including building customized applications using Java, microservices, Apex classes and triggers, Visualforce pages, Apex jobs, Lightning components, and integrations, and using Process Builder or configuring components/Flexi pages to implement expected functionality.
- Worked with different tools like Copado, Data loader, Jira, VS Code, Eclipse, SonarQube, etc.
- Able to manage deployment activities, code reviews, and deployments using Data Loader.
- Implementation of Entitlement and Milestones, platform events, visual flows, etc.
- Implement web services using REST API/SOAP API.
- Inbound / Outbound Integration.
- Involvement in Solution designing and Estimations.
- Create Email Templates, Page Layouts, Record Types, Custom Fields, Lightning Record Pages, etc.
- Performed pre-deployment and post-deployment activities.
- Experience in Workflows, custom metadata
- Implementing DevOps in Enterprise application
- For Support Developer – Problem-solving mindset
Good to have –
- BFSI industry experience
- Agile project management experience
Oracle Apps Technical :
Experience Required : 2 - 6 Years
Job Location : Thane
Role and Responsibilities :
- Check the CMGs and gather requirements from functional consultants and users.
- Modify existing reports and develop new reports as per business logic (MD50).
- Develop new packages, procedures, and functions, and update existing ones.
- Develop workflows.
- Develop APIs for data updates as per business logic.
- Develop data extraction automation programs (.sql files).
- Prepare the test cases and migration documents for migrating objects.
Education and experience :
- Bachelor's Degree in IT/Computers/Science or a relevant Master's degree required.
- 2 - 5 years of relevant experience on Oracle Apps Technical.
- Candidates must have hands-on experience with Oracle RDF/XML/PL-SQL Reports, D2K Forms, Workflow, and OAF, plus functional knowledge of O2C & P2P.


