DMS Jobs in Delhi, NCR and Gurgaon
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integration and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
7 years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions, with orchestration using Airflow.
Technical Experience:
Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution, and production support of big data ETL pipelines using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue and Lambda functions, orchestrated with Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.
➢ Author ETL processes using Python and PySpark (a minimal sketch follows this list).
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ ETL process monitoring using CloudWatch events.
➢ You will be working in collaboration with other teams; good communication is a must.
➢ Must have experience using AWS service APIs, the AWS CLI, and the SDKs.
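A minimal sketch of the kind of Glue/PySpark job described above, assuming a hypothetical sales_db catalog database, raw_orders table, and example-bucket S3 path; a production job would add error handling, job bookmarks, and parameterization:

import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job bootstrap
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glueContext = GlueContext(SparkContext())
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read raw data registered in the Glue Data Catalog (names are hypothetical)
orders = glueContext.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders")

# Basic clean-up and typing with PySpark
orders_df = orders.toDF().dropna(subset=["order_id"])
orders_df = orders_df.withColumn("amount", orders_df["amount"].cast("double"))

# Write curated Parquet back to S3, partitioned by order_date
curated = DynamicFrame.fromDF(orders_df, glueContext, "curated")
glueContext.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/",
                        "partitionKeys": ["order_date"]},
    format="parquet")

job.commit()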
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment (see the Athena sketch after this list).
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
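As a small illustration of driving Athena from Python with boto3 (the region, database, query, and S3 output location below are assumptions, not part of the role description):

import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

# Submit an asynchronous query (names and SQL are hypothetical)
execution = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) AS total FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "sales_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until Athena finishes, then fetch the result rows
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]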
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence
Hybrid work mode
(Azure) EDW: Experience loading star-schema data warehouses using framework architectures, including loading type 2 dimensions (see the Delta Lake sketch below). Experience ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Storage Gen2, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).
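A minimal sketch of a type 2 dimension load on Azure Databricks with Delta Lake; the dw.dim_customer table, the customer_id key, the tracked address attribute, and the landing path are assumptions, and a framework-driven load would generalize this pattern:

from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Latest source snapshot landed in the lake (path and schema are hypothetical)
updates = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/customers/")

dim = DeltaTable.forName(spark, "dw.dim_customer")

# Step 1: expire current rows whose tracked attribute changed
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: append rows that no longer have a current version (new or changed customers)
current_keys = dim.toDF().filter("is_current = true").select("customer_id")
new_rows = (updates.join(current_keys, "customer_id", "left_anti")
            .withColumn("is_current", F.lit(True))
            .withColumn("start_date", F.current_date())
            .withColumn("end_date", F.lit(None).cast("date")))

new_rows.write.format("delta").mode("append").saveAsTable("dw.dim_customer")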
👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Bachelor's/Master's degree or equivalent experience in computer science
- Overall 10-12 years of experience, with at least 4 years on the Jitterbit Harmony platform and Jitterbit Cloud.
- Should have experience technically leading and grooming developers who might be geographically distributed
- Knowledge of Change & Incident Management process (JIRA etc.)
RESPONSIBILITIES:
- Responsible for end-to-end implementation of integration use cases using the Jitterbit platform.
- Coordinate with all the stakeholders for successful project execution.
- Responsible for requirement gathering, integration strategy, design, implementation, etc.
- Should have strong hands-on experience in designing, building, and deploying integration solutions using the Jitterbit Harmony platform.
- Should have developed enterprise services using REST-based APIs and SOAP web services, and used different Jitterbit connectors (Salesforce, DB, JMS, File, HTTP/HTTPS, any TMS connector).
- Should have knowledge of Custom Jitterbit Plugins and Custom Connectors.
- Experience in Jitterbit implementations including security, logging, error handling, scalability and clustering.
- Strong experience in Jitterbit Script, XSLT and JavaScript.
- Install, configure, and deploy solutions using Jitterbit.
- Provide test support for bug fixes during all stages of test cycle.
- Provide support for deployment and post go-live.
- Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing.
- Understand the requirements, create the necessary documentation, give presentations to clients, get the necessary approvals, and create the design document for the release.
- Estimate tasks and discuss risks/issues with the clients.
- Work on specific modules independently and test the application. Conduct code reviews and suggest best practices to the team.
- Create necessary documentation, give presentations to clients and get necessary approvals.
- Broad knowledge of web standards relating to APIs (OAuth, SSL, CORS, JWT, etc.)
Your responsibilities as a backend engineer will include:
- Back-end software development
- Software engineering: designing data models and writing effective APIs
- Working together with engineers and product teams
- Understanding business use cases and requirements for different internal teams
- Maintenance of existing projects and new feature development
- Consume and integrate classifier/ML snippets from the data science team (see the sketch below)
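Below is a hedged sketch of what consuming such a snippet might look like on the backend; the artifact path and feature names are assumptions:

import joblib
import pandas as pd

# Load a serialized scikit-learn classifier handed over by the data science team
model = joblib.load("models/churn_classifier.joblib")

def predict_churn(records: list[dict]) -> list[int]:
    # Feature columns here are hypothetical and must match the training schema
    features = pd.DataFrame(records, columns=["tenure_months", "monthly_spend"])
    return model.predict(features).tolist()

# Example call from an API handler
print(predict_churn([{"tenure_months": 12, "monthly_spend": 49.0}]))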
What we are looking for:
- 4+ years of industry experience with the Python and Django framework.
- Degree in Computer Science or related field
- Good analytical skills with strong fundamentals of data structures and algorithms
- Experience building backend services with hands-on experience through all stages of Agile software development life cycle.
- Ability to write optimized code, debug programs, and integrate applications with third-party tools by developing various APIs
- Experience with databases (relational and non-relational), e.g., Cassandra, MongoDB, PostgreSQL
- Experience with writing REST APIs (see the Django REST Framework sketch after this list).
- Prototyping initial collection and leveraging existing tools and/or creating new tools
- Experience working with different types of datasets (e.g., unstructured, semi-structured, or with missing information)
- Ability to think critically and creatively in a dynamic environment, while picking up new tools and domain knowledge along the way
- A positive attitude, and a growth mindset
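As a non-authoritative sketch of the kind of REST API work referenced above, a minimal Django REST Framework endpoint; the Task model, fields, and route are hypothetical, and the app/urls.py wiring is elided:

from django.db import models
from rest_framework import serializers, viewsets, routers

class Task(models.Model):
    title = models.CharField(max_length=200)
    done = models.BooleanField(default=False)

class TaskSerializer(serializers.ModelSerializer):
    class Meta:
        model = Task
        fields = ["id", "title", "done"]

class TaskViewSet(viewsets.ModelViewSet):
    queryset = Task.objects.all()
    serializer_class = TaskSerializer

# Router exposes /tasks/ with list, create, retrieve, update, and delete actions
router = routers.DefaultRouter()
router.register(r"tasks", TaskViewSet)
# urlpatterns = [path("api/", include(router.urls))]  # wired up in urls.py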
Bonus:
- Experience with relevant Python libraries such as scikit-learn, NLTK, TensorFlow, and Hugging Face Transformers
- Hands-on experience in machine learning implementations
- Experience with Cloud infrastructure (e.g. AWS) and relevant microservices
- Good sense of humor and a team player
Desired candidates must have 3-7 years of experience as a Node.js developer.
If the candidate cannot relocate to Gurgaon, we can also offer permanent work from home for this position.
Roles and responsibilities:
- Responsible for understanding functional and business requirements and translating them into effective code
- Provide support till deployment of code into production.
- Ownership for ensuring code optimization, problem diagnosis, and on-time delivery
- Implement solutions as per the pre-defined framework /guidelines and adherence to processes
- Finding an optimal solution for the problem statement
- Conduct peer code review.
What the candidate should know:
- Excellent hands-on experience with Node.js, Express.js, JavaScript
- Understanding the nature of asynchronous programming and its quirks and workarounds
- Excellent hands-on experience with MongoDB (including the aggregation framework; see the pipeline sketch after this list) and MySQL
- Ability to build REST services, authentication, and MVC applications
- Excellent Object Oriented Programming skills and ability to write modular, secure, scalable, and maintainable code
- Experience with Elasticsearch and Redis.
- Knowledge of AWS components (S3, EC2, CloudFront, Redis clusters, etc.)
- Self-learning abilities are required
- Familiarity with upcoming new technologies is a strong plus
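An illustrative MongoDB aggregation pipeline, shown via PyMongo to keep the examples in one language; the same pipeline document works unchanged with the Node.js driver, and the collection and field names are made up:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Total revenue per customer for delivered orders, highest first
pipeline = [
    {"$match": {"status": "delivered"}},
    {"$group": {"_id": "$customer_id", "revenue": {"$sum": "$amount"}}},
    {"$sort": {"revenue": -1}},
    {"$limit": 10},
]
for doc in orders.aggregate(pipeline):
    print(doc["_id"], doc["revenue"])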

My client is a US-based product development company.
Responsibilities:
- Identify complex business problems and work towards building analytical solutions in order to create large business impact.
- Demonstrate leadership through innovation in software and data products from ideation/conception through design, development, and ongoing enhancement, leveraging user research techniques, traditional data tools, and techniques from the data science toolkit such as predictive modelling, NLP, statistical analysis, vector space modelling, machine learning, etc. (a small vector-space example follows this list).
- Collaborate and ideate with cross-functional teams to identify strategic questions for the business that can be solved and champion the effectiveness of utilizing data, analytics, and insights to shape business.
- Contribute to company growth efforts, increasing revenue and supporting other key business outcomes using analytics techniques.
- Focus on driving operational efficiencies by use of data and analytics to impact cost and employee efficiency.
- Baseline current analytics capability, ensure optimum utilization, and continue advancing it to stay abreast of industry developments.
- Establish yourself as a strategic partner with stakeholders, focused on the full innovation system and fully supportive of initiatives from early stages to activation.
- Review stakeholder objectives and team's recommendations to ensure alignment and understanding.
- Drive analytics thought leadership and effectively contribute towards transformational initiatives.
- Ensure the accuracy of data and deliverables produced by reporting employees through comprehensive policies and processes.
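As a small illustration of the vector space modelling mentioned in the list above, a hedged scikit-learn sketch; the example documents are invented:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "invoice overdue payment reminder",
    "payment received thank you",
    "meeting scheduled for next week",
]

# Represent each document as a TF-IDF vector and compare them pairwise
vectors = TfidfVectorizer().fit_transform(docs)
print(cosine_similarity(vectors))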
Required: Node.js, MongoDB, Java, React.js, Spring
2) Identifying potential business sources/partners for the growth of the firm
3) Handling initial client interaction and understanding client requirements
4) Lead generation




