11+ Web Dynpro Jobs in Mumbai | Web Dynpro Job openings in Mumbai
- Will be part of the ABAP team and play an integral role in implementation, support, and other projects
- Will interact with the client and key users, take part in scoping studies, suggest solutions through SAP, train the key users, and configure the SAP ABAP module
- Should have experience with at least one support project and at least two full implementations
- Should have hands-on skills in Adobe Forms, module pool programming, interface development, workflow, and Web Dynpro
- Will also be part of the solution-building team
- Should have 3+ years of SAP ABAP consulting experience
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns (see the reflection sketch after this list).
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
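As a hedged illustration of the reflections bullet above: reflection management in Dremio is SQL-driven, and one common pattern is to submit the DDL through Dremio's v3 SQL REST endpoint. The sketch below does that with Python's requests; the coordinator URL, dataset path, reflection name, column lists, and token are hypothetical placeholders, and the exact DDL keywords (ALTER DATASET vs. ALTER TABLE) and auth-header format vary by Dremio version and deployment.

```python
import requests

DREMIO_URL = "https://dremio.example.com"  # hypothetical coordinator URL
TOKEN = "<token>"  # obtained beforehand from Dremio's login API

# Aggregate reflections precompute group-bys so BI-style queries are served
# from the reflection instead of rescanning the raw Parquet/Iceberg files.
sql = """
ALTER DATASET lake.sales.orders
CREATE AGGREGATE REFLECTION orders_by_day
USING DIMENSIONS (order_date, region)
MEASURES (amount (SUM), order_id (COUNT))
"""

resp = requests.post(
    f"{DREMIO_URL}/api/v3/sql",
    headers={
        # Header format differs between Dremio Software and Dremio Cloud.
        "Authorization": f"_dremio{TOKEN}",
        "Content-Type": "application/json",
    },
    json={"sql": sql},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])  # job id; poll /api/v3/job/<id> for completion
```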
Ideal Candidate
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena); see the window-function sketch after this list.
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
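For the SQL-optimization point above, a small self-contained sketch of a window-function pattern that shows up constantly in Dremio/Presto/Trino tuning: ranking inside a subquery instead of a self-join. It runs on sqlite3 purely so the example is executable; the table and data are invented.

```python
import sqlite3

# The SQL shape is what you'd tune in Dremio/Presto/Trino; sqlite3 (SQLite
# 3.25+ supports window functions) is used only to keep the sketch runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, order_id INTEGER, amount REAL);
INSERT INTO orders VALUES
  ('east', 1, 100), ('east', 2, 250), ('west', 3, 80), ('west', 4, 90);
""")

# Top order per region: ranking in a subquery avoids a self-join and lets
# a distributed engine prune data before the final filter.
rows = conn.execute("""
SELECT region, order_id, amount FROM (
  SELECT region, order_id, amount,
         ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
  FROM orders
) AS ranked
WHERE rn = 1
ORDER BY region
""").fetchall()
print(rows)  # [('east', 2, 250.0), ('west', 4, 90.0)]
```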
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
SENIOR DATA ENGINEER:
ROLE SUMMARY:
Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.
RESPONSIBILITIES:
- Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
- Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
- Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
- Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
- DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
- Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
- Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
- Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns (a minimal DAG sketch follows this list).
- Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
- Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
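To make the orchestration bullet concrete, here is a minimal Airflow DAG sketch (Airflow 2.4+ argument names). The DAG id, schedule, and task bodies are hypothetical; a production pipeline would add retries, SLAs, alerting, and data-quality checks.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical: pull incremental rows from a source RDBMS into staging.
    ...


def transform():
    # Hypothetical: merge staged rows into the lakehouse table (e.g. SCD2).
    ...


with DAG(
    dag_id="orders_ingest",           # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",               # micro-batch cadence
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task    # transform runs only after extract succeeds
```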
REQUIRED QUALIFICATIONS:
- Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
- Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
- Programming: Python (preferred), PySpark, SQL for analytics, window functions, and performance tuning.
- ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
- Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
- DevOps/IaC: deep, hands-on Terraform practice; extensive CI/CD experience (GitHub Actions, Jenkins); config governance and release management.
- Serverless and events: Design event-driven distributed systems on AWS.
- NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
- AI services: AWS Entity Resolution, AWS Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification (see the masking sketch after this list).
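As a sketch of the PII items above (detection and masking in the responsibilities, Comprehend in the last bullet), AWS Comprehend's detect_pii_entities returns character offsets that can be masked directly. The sample text, region, and score threshold are illustrative; real usage must respect Comprehend's request-size limits and language support.

```python
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")  # region is illustrative


def mask_pii(text: str, min_score: float = 0.8) -> str:
    """Replace each PII span Comprehend detects with '*' characters."""
    resp = comprehend.detect_pii_entities(Text=text, LanguageCode="en")
    # Mask from the end of the string so earlier offsets stay valid.
    for ent in sorted(resp["Entities"], key=lambda e: e["BeginOffset"], reverse=True):
        if ent["Score"] >= min_score:
            b, e = ent["BeginOffset"], ent["EndOffset"]
            text = text[:b] + "*" * (e - b) + text[e:]
    return text


print(mask_pii("Contact Jane Doe at jane.doe@example.com"))
```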
NICE-TO-HAVE QUALIFICATIONS:
- Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
- Table and file formats: Apache Parquet; Apache Iceberg as analytical table format.
- Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking of index/vector-store results (a toy sketch follows this list).
- Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.
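A toy illustration of the re-ranking bullet above: re-ordering vector-store hits by cosine similarity to the query embedding, in plain numpy. The random embeddings stand in for a real embedding model's output, and a production system would use the vector store's own client plus a cross-encoder or LLM re-ranker.

```python
import numpy as np


def rerank(query_vec, candidates):
    """candidates: (doc_id, embedding) pairs from a vector-store lookup.
    Returns doc_ids sorted by cosine similarity to the query, best first."""
    q = query_vec / np.linalg.norm(query_vec)
    scored = [(float(q @ (emb / np.linalg.norm(emb))), doc_id)
              for doc_id, emb in candidates]
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]


# Toy usage: random 4-d vectors standing in for real embeddings.
rng = np.random.default_rng(0)
hits = [(f"doc{i}", rng.normal(size=4)) for i in range(5)]
print(rerank(rng.normal(size=4), hits))
```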
OUTCOMES AND MEASURES:
- Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
- Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
- Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
- Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
- Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.
LOCATION AND SCHEDULE:
- Location: Outside US (OUS).
- Schedule: Minimum 6 hours of overlap with US time zones.
About the Role:
As a Solution Design team member, you will focus on designing solutions for some of the most challenging and exciting problems we are working to solve in the logistics industry. You will be responsible for understanding the customer's business and supply chain challenges, and for building practical process- or technology-led solutions to them. This role is varied and fast-paced, constantly adapting to the logistics industry's landscape and business needs.
Key responsibilities:
- Study the customer's supply chain processes, build a deep understanding of industry-specific supply chains and their underlying challenges, and design holistic solutions.
- Analyze data to come up with actionable insights.
- Drive implementation of complex engagements.
- Become a knowledge powerhouse within the organization for anything related to logistics.
- Develop holistic business requirements and drive product development while working with the product and technology team.
- Take ownership of complex projects, work with cross-functional teams and drive the projects to completion. Be accountable for the overall technical excellence and quality of the technical output.
- Educate and support customers, both pre- and post-sales, helping them with implementation, testing, integrations, and more.
Preferred qualifications:
- MBA with 4+ years of solid experience in Logistics.
- Good knowledge of logistics, preferably within Steel, Cement, FMCG, Transportation or related industries.
- Good to have experience working with B2B product-based organizations.
- Ability to understand the processes and cost drivers of customers from different industries.
- Strong analytical skills, with the ability to translate data into insights.
- Self-motivated and results-oriented, with a bias for speed and action.
- Good verbal, written, social, presentation and interpersonal skills.
- Ability to thrive in a multi-tasking environment, adjusting priorities on the fly while staying detail-oriented and analytical.
Key responsibilities:
- Cold-call prospective leads and maintain an Excel sheet with customer details and calling status.
- Generate new leads from existing clients.
- Independently handle the first level of conversation on chosen services.
- Send prospects emails with product descriptions and white papers.
- Follow up over phone and email to win opportunities/sales.
- Convert hot leads into face-to-face or telephonic meetings.
- Assign converted meetings to sales team members by sending meeting calendars and customer credentials.
- Take the meeting status from the concerned account manager.
- Log all lead and meeting-related data and status in CRM/Tools.
- Maintain bills and coordinate with the Accounts department regarding the same.
- Maintain office supplies by checking inventory and preparing the required items list monthly.
Company Summary:
Quantsapp is India's first option-trading analytics platform on mobile. With an ever-growing user base, it is one of the fastest-growing platforms for options trading in India. Quantsapp now aims to accelerate its growth further and expand into new countries, which requires the development team to grow.
At Quantsapp, we are looking for a dynamic teammate to take up the role of Angular Developer.
Responsibilities and Duties:
- Develop and maintain a dynamic web app used by thousands of traders on a daily basis.
- Integrate data representation with dynamic charts.
- Integrate REST APIs and sockets for real-time data feeds.
- Build a mobile-optimized UI.
Required Experience and Qualifications:
- Minimum 2 years of hands-on experience with Angular 8+ and a track record of deploying live websites.
- Strong understanding of website development, with the ability to take complete ownership of it.
- Strong UI/UX sense.
- Knowledge of AWS would be an added advantage.
Role :
- Understand and translate statistics and analytics to address business problems.
- Help with data pull and preparation, the first step in any machine-learning workflow.
- Cut and slice data to extract interesting insights (see the pandas sketch after the requirements list below).
- Develop models for better customer engagement and retention.
- Hands-on experience with relevant tools such as SQL (expert level), Excel, and R/Python.
- Work on strategy development to increase business revenue.
Requirements:
- Hands-on experience with relevant tools such as SQL (expert level), Excel, and R/Python.
- Strong working knowledge of statistics.
- Should be able to do data scraping and data mining.
- Be self-driven, and show ability to deliver on ambiguous projects
- An ability and interest in working in a fast-paced, ambiguous and rapidly-changing environment
- Should have worked on business projects for an organization, e.g., customer acquisition or customer retention.
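To illustrate the cut-and-slice expectation referenced in the role description, a minimal pandas sketch: per-customer aggregation with a repeat-buyer flag, the kind of slice that feeds a retention analysis. The frame and columns are invented for the example.

```python
import pandas as pd

# Hypothetical transactions: one row per customer order.
df = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "order_month": ["2024-01", "2024-02", "2024-01", "2024-01", "2024-02", "2024-03"],
    "revenue": [120, 80, 200, 50, 60, 70],
})

# Slice: orders and revenue per customer, flagging repeat buyers --
# a typical cut behind a customer-retention business project.
summary = (
    df.groupby("customer_id")
      .agg(orders=("order_month", "nunique"), revenue=("revenue", "sum"))
      .assign(repeat_buyer=lambda t: t["orders"] > 1)
)
print(summary)
```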
- Business development
- OEM management
- Warehousing management
- Funding
- Establishing new contacts
- End-to-end sales cycle
Must have -
1- Magento 2 working experience for minimum 2 years
2- Work with Magento integration systems, viz. CMS/CRM/payment gateway/shipping, etc.
3- Analyse, code, debug, test, document & deploy applications
4- Experience building REST APIs
5- Knowledge of GTM and FB integration
Good to have-
1- Magento 2 certification
2- Experience in front-end tech like HTML5, CSS3, jQuery and ReactJS
3- Experience of working on PWA apps



