11+ DMS Jobs in Bangalore (Bengaluru)
Apply to 11+ DMS job openings in Bangalore (Bengaluru) on CutShort.io. Explore the latest DMS job opportunities across top companies like Google, Amazon & Adobe.


AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
7 years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization and in designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions, with orchestration using Airflow.
Technical Experience:
Hands-on experience developing a data platform and its components: a data lake, a cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions, and Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, and Athena.
➢ Author ETL processes using Python and PySpark (a minimal sketch follows this list).
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ Monitor ETL processes using CloudWatch Events.
➢ You will be working in collaboration with other teams; good communication is a must.
➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
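For illustration, here is a minimal PySpark sketch of the kind of ingestion-and-transform step described above. The bucket names, paths, and columns are hypothetical placeholders, and a real AWS Glue job would typically wrap similar logic in Glue's GlueContext/DynamicFrame APIs; this is a sketch of the pattern, not the team's actual pipeline.

```python
# Minimal PySpark ETL sketch (illustrative; bucket names, paths, and
# columns are hypothetical, not taken from the job description).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Ingest raw CSV files landed in S3 (placeholder bucket/path).
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/orders/")
)

# Example transformation: cast types, filter bad rows, daily rollup.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

# Write partitioned Parquet back to S3, queryable via Athena or
# Redshift Spectrum external tables.
(daily.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders_daily/"))
```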
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes; expert-level skills in writing and optimizing SQL; extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence

Role & Responsibilities
work with peers in Product, QA, and other Engineering departments;
coach and mentor team members;
cautiously drive adoption of new technologies and processes;
preserve our engineering values of quality, scalability, and maintainability;
“see around corners” — identify blind spots and prioritize work across teams;
work with international teams to ensure successful product development and delivery; and
own the overall architecture and systems engineering for your products.
Job Description: We are looking for a talented and motivated Software Engineer with
expertise in both Windows and Linux operating systems and solid experience in Java
technologies. The ideal candidate should be proficient in data structures and algorithms, as
well as frameworks like Spring MVC, Spring Boot, and Hibernate. Hands-on experience
working with MySQL databases is also essential for this role.
Responsibilities:
● Design, develop, test, and maintain software applications using Java technologies.
● Implement robust solutions using Spring MVC, Spring Boot, and Hibernate frameworks.
● Develop and optimize database operations with MySQL.
● Analyze and solve complex problems by applying knowledge of data structures and
algorithms.
● Work with both Windows and Linux environments to develop and deploy solutions.
● Collaborate with cross-functional teams to deliver high-quality products on time.
● Ensure application security, performance, and scalability.
● Maintain thorough documentation of technical solutions and processes.
● Debug, troubleshoot, and upgrade legacy systems when required.
Requirements:
● Operating Systems: Expertise in Windows and Linux environments.
● Programming Languages & Technologies: Strong knowledge of Java (Core Java, Java 8+).
● Frameworks: Proficiency in Spring MVC, Spring Boot, and Hibernate.
● Algorithms and Data Structures: Good understanding and practical application of DSA
concepts.
● Databases: Experience with MySQL – writing queries, stored procedures, and performance
tuning.
● Version Control Systems: Experience with tools like Git.
● Deployment: Knowledge of CI/CD pipelines and tools such as Jenkins and Docker (optional).
- Telemarketing Executives:
Responsibilities:
- Reach out to potential clients via phone calls to introduce our services.
- Schedule appointments or follow-ups for the sales team.
- Maintain accurate records of calls and customer details.
Requirements:
- Strong communication skills with a pleasant telephone manner.
- Previous experience in telemarketing or customer service preferred.
- Goal-oriented and self-motivated.
- Ability to handle rejection positively and persistently.


Job Title – Data Scientist (Forecasting)
Anicca Data is seeking a Data Scientist (Forecasting) who is motivated to apply his/her/their skill set to solve complex and challenging problems. The focus of the role will center on applying deep learning models to real-world applications. The candidate should have experience in training and testing deep learning architectures, and is expected to work on existing codebases or write an optimized codebase at Anicca Data. The ideal addition to our team is self-motivated, highly organized, and a team player who thrives in a fast-paced environment, with the ability to learn quickly and work independently.
Job Location: Remote (for time being) and Bangalore, India (post-COVID crisis)
Required Skills:
- 3+ years of experience in a Data Scientist role
- Bachelor's/Master's degree in Computer Science, Engineering, Statistics, Mathematics, or a similar quantitative discipline; a Ph.D. will add merit to the application process
- Experience with large data sets, big data, and analytics
- Exposure to statistical modeling, forecasting, and machine learning; deep theoretical and practical knowledge of deep learning, machine learning, statistics, probability, and time series forecasting
- Experience training Machine Learning (ML) algorithms in the areas of forecasting and prediction
- Experience in developing and deploying machine learning solutions in a cloud environment (AWS, Azure, Google Cloud) for production systems
- Ability to research and enhance existing in-house and open-source models, integrate innovative techniques, or create new algorithms to solve complex business problems
- Experience in translating business needs into problem statements, prototypes, and minimum viable products
- Experience managing complex projects including scoping, requirements gathering, resource estimations, sprint planning, and management of internal and external communication and resources
- Ability to write C++ and Python code, along with TensorFlow and PyTorch, to build and enhance the platform used for training ML models
Preferred Experience:
- Worked on forecasting projects – both classical and ML models
- Experience training time series forecasting methods such as Moving Average (MA) and Autoregressive Integrated Moving Average (ARIMA), as well as Neural Network (NN) models such as feed-forward NNs and Nonlinear Autoregressive networks (a minimal classical-forecasting sketch follows this list)
- Strong background in forecasting accuracy drivers
- Experience in Advanced Analytics techniques such as regression, classification, and clustering
- Ability to explain complex topics in simple terms, ability to explain use cases and tell stories
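As a rough illustration of the classical side of this work, here is a minimal sketch of fitting an ARIMA model with statsmodels and scoring a short hold-out forecast. The synthetic series, the (1, 1, 1) order, and the 6-step horizon are made-up assumptions for demonstration, not Anicca Data's actual pipeline.

```python
# Minimal ARIMA forecasting sketch (illustrative only; the data and
# model order are placeholder assumptions, not a production setup).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series with trend + noise as a stand-in dataset.
rng = np.random.default_rng(seed=42)
index = pd.date_range("2020-01-01", periods=48, freq="MS")
series = pd.Series(np.linspace(100.0, 200.0, 48) + rng.normal(0, 5, 48),
                   index=index)

# Hold out the last 6 points to check forecasting accuracy.
train, test = series[:-6], series[-6:]

# ARIMA(1, 1, 1) is an arbitrary example order; in practice it would
# be chosen via ACF/PACF inspection or information criteria (AIC/BIC).
model = ARIMA(train, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=6)

# Mean absolute percentage error as one simple accuracy driver metric.
mape = float(np.mean(np.abs((test - forecast) / test))) * 100
print(f"6-step MAPE: {mape:.2f}%")
```

The same train/hold-out loop extends naturally to the NN-based forecasters mentioned above; only the model-fitting step changes.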
Experience: 5+ Years
- Should have strong experience in Salesforce development.
- Knowledge of Lightning.
- Hands-on with Visualforce and Apex.
- SFDC experience with a focus on Apex, Triggers, and Visualforce development.
- Experience enhancing and debugging existing Apex and Visualforce codebases.
- Should have good experience in Salesforce configuration and customization.
- Must have SFDC knowledge including, but not limited to, Workflows, Validations, Approval Processes, the Security Model, and the Data Model.
- Must have hands-on customization experience with Apex, Visualforce, Triggers, and Batch and Scheduled Apex.
- Must have awareness of application migration tools and methods.
- Must have awareness of data migration using Apex Data Loader.
- Good communication skills required.
You will pitch the PagarBook desktop solution to customers on calls.
You will explain the benefits of the PagarBook desktop solution to customers: better and easier accessibility, also available on the mobile web.
Minimum 100 calls to be made.
Sourcing new sales opportunities through inbound lead follow-up and outbound cold calls.
Understanding customer needs and requirements, and pitching the product according to the client's needs.
Routing qualified opportunities for further development and closure.
Product features to highlight:
Access to rich reports that give you business knowledge.
Unlimited free upgrades for a year.
Bulk update features.
Expense management.
Users can register on the desktop for a free 7-day trial.
Sales associates would then convert the customer into a paid customer.

Role - Strong Experts in C++11/C++14 (Embedded Linux)
About GlobalLogic - www.globallogic.com
Experience - 5 to 18 years
Location: Bangalore, India
Must-Have Key Skills:
Strong Embedded Linux system experience
Strong C++11/14 programming: OOPS, OOAD, design patterns
Linux, ADS pipelines, STL
Embedded systems experience
Proc filesystem
Socket programming
Memory management in Linux; memory debugging
Threads and synchronization
Linux IPC: sockets, the accept system call
C++ STL containers
Virtual functions: vptr, polymorphism
Smart pointers
Application development in multi-process/multi-thread environments using C++/C++11/C++14
Application/middleware development for consumer electronic devices
NDK/SDK kit development
Video domain experience
Linux system experience
Expertise in Linux systems and kernel-level programming (good to have)

We are looking for a Node.js Developer who is proficient in writing APIs, working with data, and using AWS, and who is capable of applying algorithms (mainly machine-learning-based) to solve problems and create or modify features for our students. Your primary focus will be the development of all server-side logic, the definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front end. You will also be responsible for integrating the front-end elements built by your co-workers into the application; therefore, a basic understanding of front-end technologies is necessary as well.
Responsibilities
- Integration of user-facing elements developed by front-end developers with server-side logic
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Implementation of security and data protection
- Use of algorithms to drive data analytics and features.
- Ability to use AWS to solve scale issues.
Apply only if you can attend a face-to-face interview in Bangalore.