Microsoft Dynamics AX Jobs in Delhi, NCR and Gurgaon


Apply to 11+ Microsoft Dynamics AX Jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest Microsoft Dynamics AX Job opportunities across top companies like Google, Amazon & Adobe.

Remote only
4 - 9 yrs
₹7L - ₹11L / yr
Customer Relationship Management (CRM)
Dynamics CRM
Microsoft Dynamics CRM
CRM SDK
Microsoft Dynamics
Roles and Responsibilities

1. Minimum 4-9 years of experience in customization and implementation of Microsoft Dynamics CRM

2. CRM development experience:

3. Plug-in development

4. Custom workflow assembly development

5. Data migration and integration using Dynamics CRM tools.

6. Using web resources to communicate with MS CRM services

7. CRM form programming (JavaScript, ribbon customization)

8. Custom page integration (embedding custom ASP.NET pages into CRM and using JavaScript to integrate with CRM forms)

9. Generating Reports using SSRS

10. C#, .NET, JavaScript

11. CRM SDK and exposure to Web services.

12. Should have worked on both Dynamics CRM on-premise and Dynamics CRM Online

13. Should have experience integrating Dynamics CRM with ERP systems such as D365 AX or D365 Finance & Operations, as well as with other applications, using web services
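As a hedged sketch of the web-service integration point above: Dynamics 365 exposes an OData Web API, so pushing an ERP record into CRM can be modelled as an authenticated HTTP POST. The entity set, field names, URL, and token below are hypothetical placeholders, not a real tenant:

```python
# Sketch only: entity set, fields, URL and token are hypothetical placeholders.
import json
import urllib.request

def build_crm_create_request(base_url, entity_set, record, token):
    """Build (but do not send) a Dynamics Web API request creating one record."""
    return urllib.request.Request(
        url=f"{base_url}/api/data/v9.2/{entity_set}",
        data=json.dumps(record).encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # token from Azure AD in practice
            "Content-Type": "application/json",
            "OData-Version": "4.0",
        },
    )

req = build_crm_create_request(
    "https://contoso.crm.dynamics.com", "invoices",
    {"name": "INV-0042", "totalamount": 1200.0}, "<access-token>")
print(req.full_url)  # https://contoso.crm.dynamics.com/api/data/v9.2/invoices
```

A real integration would send the request (e.g. with `urllib.request.urlopen`) and add authentication, retries, and throttling handling.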

Skills and Attributes

1. Should be ready to work as a Solution architect for big implementations

2. Strong communication and writing skills.

3. Should be able to work as a team lead/technical lead, mentoring developers and solving their technical issues.

4. Should be ready to travel onsite as needed.

Qualification: B.Tech
Fintech lead
Agency job via The Hub by Sridevi Viswanathan
Gurugram, Noida
3 - 8 yrs
₹5L - ₹15L / yr
Natural Language Processing (NLP)
BERT
Machine Learning (ML)
Data Science
Python

Who we are looking for

· A Natural Language Processing (NLP) expert with strong computer science fundamentals and experience in working with deep learning frameworks. You will be working at the cutting edge of NLP and Machine Learning.

Roles and Responsibilities

· Work as part of a distributed team to research, build and deploy Machine Learning models for NLP.

· Mentor and coach other team members

· Evaluate the performance of NLP models and ideate on how they can be improved

· Support internal and external NLP-facing APIs

· Keep up to date on current research around NLP, Machine Learning and Deep Learning

Mandatory Requirements

·       Any graduate degree with at least 2 years of demonstrated experience as a Data Scientist.

Behavioural Skills

· Strong analytical and problem-solving capabilities.

· Proven ability to multi-task and deliver results within tight time frames

· Must have strong verbal and written communication skills

· Strong listening skills and eagerness to learn

· Strong attention to detail and the ability to work efficiently in a team as well as individually

Technical Skills

Hands-on experience with

· NLP

· Deep Learning

· Machine Learning

· Python

· BERT

Preferred Requirements

· Experience in Computer Vision is preferred

Role: Data Scientist

Industry Type: Banking

Department: Data Science & Analytics

Employment Type: Full Time, Permanent

Role Category: Data Science & Machine Learning

Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram
5 - 10 yrs
₹14L - ₹15L / yr
Google Cloud Platform (GCP)
Spark
PySpark
Apache Spark
"DATA STREAMING"

Data Engineering: Senior Engineer / Manager


As a Senior Engineer / Manager in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles when creating custom solutions or implementing package solutions, and independently drive design discussions to ensure the overall health of the solution.


Must-have skills:


1. GCP


2. Spark Streaming: live data streaming experience is desired.


3. Any one coding language: Java / Python / Scala



Skills & Experience :


- Overall experience of minimum 5 years, with at least 4 years of relevant experience in Big Data technologies


- Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
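To illustrate the real-time pipeline idea in plain Python (event values and window size are invented): a Spark Structured Streaming job doing `groupBy(window(...), key).count()` is, at heart, a tumbling-window aggregation like this toy version:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per key within fixed, non-overlapping time windows.

    Mimics the aggregation a streaming engine performs per micro-batch:
    each (timestamp, key) event is assigned to the window containing it.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # e.g. ts=12, window=10 -> 10
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
counts = tumbling_window_counts(events, 10)
# Window [0, 10): 2 clicks, 1 view; window [10, 20): 1 click.
```

A real streaming job additionally handles late data (watermarks) and incremental state, which this batch sketch omits.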


- Strong experience in at least one of the programming languages Java, Scala, Python; Java preferred


- Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


- Well-versed in, and working knowledge of, data platform services on GCP


- Bachelor's degree and 6 to 12 years of work experience, or any combination of education, training and/or experience that demonstrates the ability to perform the duties of the position


Your Impact :


- Data Ingestion, Integration and Transformation


- Data Storage and Computation Frameworks, Performance Optimizations


- Analytics & Visualizations


- Infrastructure & Cloud Computing


- Data Management Platforms


- Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time


- Build functionality for data analytics, search and aggregation

Rezo.AI
Posted by Aditya Deo
Noida
2 - 6 yrs
₹10L - ₹24L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm

About Us

We are an AI-Powered CX Cloud that enables enterprises to transform customer experience and boost revenue with our APIs by automating and analyzing customer interactions at scale. We assist across multiple voice and non-voice channels in 30+ languages, while coaching and training agents at minimal cost.

 

The problem we are solving

Customer support in traditional contact centers is quite poor in comparison to worldwide norms. Due to a high volume of queries, insufficient agent capacity, and inadequate support systems, businesses struggle with a multi-fold rise in customer discontent and bounce rate, resulting in connectivity failure points between them and their customers. To address this issue, IITian couple Manish and Rashi Gupta founded Rezo's AI-Powered CX Cloud for Enterprises in 2018 to help businesses avoid customer churn and boost revenue without incurring heavy costs, by providing 24x7 real-time responses to customer inquiries with minimal human interaction.

 

Roles and Responsibilities :

  • Speech Recognition model development across multiple languages.
  • Solve critical real-world scenarios - Noisy channel ASR performance, Multi speaker detection, etc.
  • Implement and deliver PoCs/UAT products on the Rezo platform.
  • Responsible for product performance, robustness and reliability.

Requirements:

  • 2+ years of experience with a Bachelor's/Master's degree with a focus on CS, Machine Learning, and Signal Processing.
  • Strong knowledge of various ML concepts/algorithms and hands-on experience in relevant projects.
  • Experience with machine learning platforms such as TensorFlow and PyTorch, and solid programming skills (Python, C, C++, etc.).
  • Ability to learn new tools, languages and frameworks quickly.
  • Familiarity with databases, data transformation techniques, and ability to work with unstructured data like OCR/ speech/text data.
  • Previous experience with working in Conversational AI is a plus.
  • A Git portfolio will be helpful.

Life at Rezo.AI

  • We take transparency very seriously. Along with a full view of team goals, get a top-level view across the board with our regular town hall meetings.
  • A highly inclusive work culture that promotes a relaxed, creative, and productive environment.
  • Practice autonomy, open communication, and growth opportunities, while maintaining a perfect work-life balance.
  • Go on company-sponsored offsites, and blow off steam with your work buddies.

 

Perks & Benefits

Learning is a way of life. Unlock your full potential backed with cutting-edge tools and mentorship.

Get best-in-class medical insurance, programs for taking care of your mental health, and a contemporary leave policy (beyond sick leaves).

 

Why Us?

We are a fast-paced start-up with some of the best talent from diverse backgrounds, working together to solve customer service problems. We believe a diverse workforce is a powerful multiplier of innovation and growth, which is key to providing our clients with the best possible service and our employees with the best possible careers. Diversity makes us smarter, more competitive, and more innovative.

 

Explore more here

www.rezo.ai

Celebal Technologies
2 recruiters
Posted by Payal Hasnani
Jaipur, Noida, Gurugram, Delhi, Ghaziabad, Faridabad, Pune, Mumbai
5 - 15 yrs
₹7L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
Job Responsibilities:

• Project Planning and Management
o Take end-to-end ownership of multiple projects / project tracks
o Create and maintain project plans and other related documentation for project
objectives, scope, schedule and delivery milestones
o Lead and participate across all the phases of software engineering, right from
requirements gathering to GO LIVE
o Lead internal team meetings on solution architecture, effort estimation, manpower
planning and resource (software/hardware/licensing) planning
o Manage RIDA (Risks, Impediments, Dependencies, Assumptions) for projects by
developing effective mitigation plans
• Team Management
o Act as the Scrum Master
o Conduct SCRUM ceremonies like Sprint Planning, Daily Standup, Sprint Retrospective
o Set clear objectives for the project and roles/responsibilities for each team member
o Train and mentor the team on their job responsibilities and SCRUM principles
o Make the team accountable for their tasks and help the team in achieving them
o Identify the requirements and come up with a plan for Skill Development for all team
members
• Communication
o Be the Single Point of Contact for the client in terms of day-to-day communication
o Periodically communicate project status to all the stakeholders (internal/external)
• Process Management and Improvement
o Create and document processes across all disciplines of software engineering
o Identify gaps and continuously improve processes within the team
o Encourage team members to contribute towards process improvement
o Develop a culture of quality and efficiency within the team

Must have:
• Minimum 8 years of experience (hands-on as well as leadership) in software / data engineering
across multiple job functions like Business Analysis, Development, Solutioning, QA, DevOps and
Project Management
• Hands-on as well as leadership experience in Big Data Engineering projects
• Experience developing or managing cloud solutions using Azure or another cloud provider
• Demonstrable knowledge of Hadoop, Hive, Spark, NoSQL DBs, SQL, Data Warehousing, ETL/ELT,
DevOps tools
• Strong project management and communication skills
• Strong analytical and problem-solving skills
• Strong systems level critical thinking skills
• Strong collaboration and influencing skills

Good to have:
• Knowledge on PySpark, Azure Data Factory, Azure Data Lake Storage, Synapse Dedicated SQL
Pool, Databricks, PowerBI, Machine Learning, Cloud Infrastructure
• Background in BFSI with focus on core banking
• Willingness to travel

Work Environment
• Customer Office (Mumbai) / Remote Work

Education
• UG: B. Tech - Computers / B. E. – Computers / BCA / B.Sc. Computer Science
HCL Technologies
3 recruiters
Agency job via Saiva System by Sunny Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Mumbai, Kolkata
5 - 10 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
Experience: 5+ years
Skills: Spark and Scala, along with Azure
Location: Pan India

Looking for someone with Big Data skills along with Azure
Bengaluru (Bangalore), Mumbai, Gurugram, Nashik, Pune, Visakhapatnam, Chennai, Noida
3 - 5 yrs
₹8L - ₹12L / yr
Oracle Analytics
OAS
OAC
Oracle OAS
Oracle

Oracle OAS Developer

 

 

Senior OAS/OAC (Oracle Analytics) designer and developer with 3+ years of experience. Has worked on the new Oracle Analytics platform, using its latest features and custom plug-ins, and designing new ones in Java. Has a good understanding of the various graph data points and their appropriate usage for financial data display. Has worked on performance tuning and built complex data security requirements.

Qualifications



Bachelor's university degree in Engineering/Computer Science.

Additional information

Knowledge of Financial and HR dashboards

 

Falcon Autotech
1 recruiter
Posted by Rohit Kaushik
Noida
3 - 7 yrs
₹4L - ₹7L / yr
Data Analytics
Data Analyst
Tableau
MySQL
SQL
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Expertise in SQL/PL-SQL: ability to write procedures and create queries for reporting purposes.
  • Must have worked on a reporting tool – Power BI/Tableau etc.
  • Strong knowledge of Excel/Google Sheets: must have worked with pivot tables, aggregate functions, and logical IF conditions.
  • Strong verbal and written communication skills for coordination with departments.
  • An analytical mind and inclination for problem-solving
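The SQL reporting skills listed above can be sketched with Python's built-in sqlite3 module (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES ('North', 100), ('North', 250), ('South', 80);
""")

# A typical reporting query: aggregate per group, like a pivot-table summary.
rows = conn.execute(
    "SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY region"
).fetchall()
# rows -> [('North', 2, 350.0), ('South', 1, 80.0)]
```

The same GROUP BY pattern carries over to Power BI/Tableau data sources and to PL/SQL reporting procedures.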
PAGO Analytics India Pvt Ltd
Vijay Cheripally
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
Be an integral part of large scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python/PySpark, Java, C#, C++) and frameworks (e.g. J2EE or .NET)
Database programming using any flavours of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure across all the SDLC process, including testing and deployment
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc.
Good knowledge of Python and Spark is required
Good understanding of how to enable analytics using cloud technology and ML Ops
Experience with Azure infrastructure and Azure DevOps will be a strong plus
Remote, NCR (Delhi | Gurgaon | Noida)
3 - 12 yrs
₹8L - ₹14L / yr
Data Warehouse (DWH)
ETL
Amazon Redshift

Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Role and Responsibility

·         Plan, create, coordinate, and deploy data warehouses.

·         Design end user interface.

·         Create best practices for data loading and extraction.

·         Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.

·         Develop reporting applications and maintain data warehouse consistency.

·         Facilitate requirements gathering using expert listening skills and develop unique simple solutions to meet the immediate and long-term needs of business customers.

·         Supervise design throughout implementation process.

·         Design and build cubes while performing custom scripts.

·         Develop and implement ETL routines according to the DWH design and architecture.

·         Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for data warehouse.

·         Monitor the DWH and BI systems' performance and integrity, and provide corrective and preventative maintenance as required.

·         Manage multiple projects at once.
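The ETL responsibilities above reduce to a minimal extract-transform-load sketch (the source format, fact table, and business rules below are hypothetical):

```python
import sqlite3

def run_etl(source_rows, conn):
    """Extract raw rows, transform (clean fields, derive revenue, reject bad
    records), and load the result into a warehouse fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales (sku TEXT, qty INTEGER, revenue REAL)")
    cleaned = [
        (r["sku"].strip().upper(), int(r["qty"]),
         int(r["qty"]) * float(r["unit_price"]))
        for r in source_rows
        if r.get("sku", "").strip() and int(r["qty"]) > 0  # data-quality filter
    ]
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", cleaned)
    return len(cleaned)

conn = sqlite3.connect(":memory:")
source = [{"sku": " ab1 ", "qty": "2", "unit_price": "9.5"},
          {"sku": "", "qty": "1", "unit_price": "4.0"}]  # second row is rejected
loaded = run_etl(source, conn)
# loaded -> 1; fact_sales contains ('AB1', 2, 19.0)
```

A production ETL routine would add scheduling, incremental loads, and audit logging on top of this core pattern.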

DESIRABLE SKILL SET

·         Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as with SSIS and stored procedures

·         Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases

·         High proficiency in dimensional modeling techniques and their applications

·         Strong analytical, consultative, and communication skills; as well as the ability to make good judgment and work with both technical and business personnel

·         Several years of working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools

·         Working knowledge of SAS and R code used in data processing and modeling tasks

·         Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other “big data” technologies such as AWS Redshift or Google BigQuery

 

Precily Private Limited
5 recruiters
Posted by Bharath Rao
NCR (Delhi | Gurgaon | Noida)
1 - 3 yrs
₹3L - ₹9L / yr
Data Science
Artificial Neural Network (ANN)
Artificial Intelligence (AI)
Machine Learning (ML)
Python
Precily AI: automatic summarization, shortening a business document or book with our AI. It creates a summary of the major points of the original document, producing a coherent summary that accounts for variables such as length, writing style, and syntax. We are also working in the legal domain to reduce the high number of pending cases in India. We use Artificial Intelligence and Machine Learning capabilities such as NLP and neural networks to process data and provide solutions for industries such as Enterprise, Healthcare, and Legal.
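As a hedged illustration of the summarization idea described above, here is extractive summarization in its simplest form (production systems use NLP models, not this frequency heuristic; the sample text is invented):

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Naive extractive summary: score each sentence by the corpus-wide
    frequency of its words, then keep the top-scoring sentences in order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    top = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )[:n_sentences]
    return " ".join(s for s in sentences if s in top)  # preserve original order

text = ("Courts hear many cases. Pending cases burden courts. "
        "Summaries help courts review cases faster.")
print(summarize(text, 1))  # Summaries help courts review cases faster.
```

Neural abstractive models generate new wording rather than selecting sentences, but the goal, condensing a document to its major points, is the same.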