Acuity Knowledge Partners

Data Engineer

Posted by Gangadhar S
4 - 9 yrs
₹16L - ₹40L / yr
Bengaluru (Bangalore)
Skills
Python
Amazon Web Services (AWS)
CI/CD
MongoDB
MLOps
Software deployment

Job Responsibilities:

1. Develop and debug applications using Python.

2. Improve code quality and code coverage for existing and new programs.

3. Deploy and integrate machine learning models.

4. Test and validate the deployments.

5. Support MLOps functions.


Technical Skills

1. Graduate in Engineering or Technology with strong academic credentials

2. 4 to 8 years of experience as a Python developer

3. Excellent understanding of SDLC processes

4. Strong knowledge of unit testing and code-quality improvement

5. Cloud-based deployment and integration of applications/microservices

6. Experience with NoSQL databases such as MongoDB and Cassandra

7. Strong applied statistics skills

8. Knowledge of creating CI/CD pipelines and touchless deployment

9. Knowledge of APIs and data engineering techniques

10. Experience with AWS

11. Knowledge of machine learning and large language models (LLMs)
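The unit-testing and code-quality skills above can be illustrated with a minimal sketch using only Python's standard-library unittest; `normalize` is a hypothetical function under test, not part of this role's actual codebase.

```python
# Minimal unit-testing sketch with the standard library's unittest.
# The function under test is hypothetical.
import unittest

def normalize(values):
    """Scale a list of numbers so they sum to 1.0."""
    total = sum(values)
    if total == 0:
        raise ValueError("cannot normalize an all-zero list")
    return [v / total for v in values]

class NormalizeTest(unittest.TestCase):
    def test_sums_to_one(self):
        self.assertAlmostEqual(sum(normalize([1, 2, 3])), 1.0)

    def test_rejects_all_zero(self):
        with self.assertRaises(ValueError):
            normalize([0, 0])
```

Running the file through `python -m unittest` executes both tests; coverage tools can then report which branches the tests exercise.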


Nice to Have

1. Exposure to financial research domain

2. Experience with JIRA, Confluence

3. Understanding of scrum and Agile methodologies

4. Experience with data visualization tools such as Grafana, ggplot, etc.

Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About Acuity Knowledge Partners

Founded: 2019
Type:
Size: 5000+
Stage: Profitable
About

Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 500 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide.

We EMPOWER our clients to drive revenues higher. We INNOVATE using our proprietary technology and automation solutions. Finally, we enable our clients to TRANSFORM their operating model and cost base.


Similar jobs

Fullness Web Solutions
Posted by Vidhu Bajaj
Pune
2 - 3 yrs
₹6L - ₹8L / yr
Python
AWS Lambda

Hiring alert 🚨


Calling all #PythonDevelopers looking for an #ExcitingJobOpportunity 🚀 with one of our #Insurtech clients.


Are you a Junior Python Developer eager to grow your skills in #BackEnd development?


Our company is looking for someone like you to join our dynamic team. If you're passionate about Python and ready to learn from seasoned developers, this role is for you!


📣 About the company


The client is a fast-growing consultancy firm helping P&C Insurance companies on their digital journey. With offices in Mumbai and New York, they're at the forefront of insurance tech. Plus, they offer a hybrid work culture with flexible timings, typically between 9 and 5, to accommodate your work-life balance.


💡 What you’ll do


📌 Work with other developers.

📌 Implement Python code with assistance from senior developers.

📌 Write effective test cases, such as unit tests, to ensure the code meets the software design requirements.

📌 Ensure Python code is efficient and well written when executed.

📌 Refactor old Python code to ensure it follows modern principles.

📌 Liaise with stakeholders to understand the requirements.

📌 Ensure integration can take place with front end systems.

📌 Identify and fix bugs in the code.


🔎 What you’ll need


📌 Minimum 3 years of experience writing AWS Lambda functions in Python

📌 Knowledge of other AWS services like CloudWatch and API Gateway

📌 Fundamental understanding of Python and its frameworks.

📌 Ability to write simple SQL queries

📌 Familiarity with AWS Lambda deployment

📌 The ability to problem-solve.

📌 Fast learner with an ability to adapt techniques based on requirements.

📌 Knowledge of how to effectively test Python code.

📌 Great communication and collaboration skills.
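The AWS Lambda requirement above can be sketched minimally. The handler below is illustrative only: the event shape assumes an API Gateway proxy integration, and the `policy_id` field is a hypothetical example, not the client's actual API.

```python
# Illustrative AWS Lambda handler in Python (API Gateway proxy-style event;
# the payload field names are hypothetical assumptions).
import json

def lambda_handler(event, context):
    """Validate a JSON body and echo back the policy id."""
    body = json.loads(event.get("body") or "{}")
    policy_id = body.get("policy_id")
    if policy_id is None:
        return {"statusCode": 400,
                "body": json.dumps({"error": "policy_id required"})}
    return {"statusCode": 200,
            "body": json.dumps({"policy_id": policy_id, "status": "ok"})}
```

Locally this can be exercised by calling `lambda_handler` with a dict; deployed, API Gateway supplies the event and CloudWatch captures the logs.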

 

SynRadar
Posted by Ashish Rao
Mumbai
0 - 3 yrs
₹5L - ₹10L / yr
Amazon Web Services (AWS)
Docker
Python
MongoDB
Web API
+2 more

This profile will include the following responsibilities:

 

- Develop parsers for XML and JSON data sources/feeds

- Write automation scripts for product development

- Build API integrations for 3rd-party products

- Perform data analysis

- Research machine learning algorithms

- Understand AWS cloud architecture and work with 3rd-party vendors on deployments

- Resolve issues in the AWS environment

We are looking for candidates with:
Qualification: BE/BTech/BSc-IT/MCA
Programming Language: Python
Web Development: Basic understanding of web development; working knowledge of Python Flask is desirable
Database & Platform: AWS/Docker/MySQL/MongoDB
A basic understanding of machine learning models and AWS fundamentals is recommended.
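The XML/JSON parser work above can be sketched with the standard library alone; the feed format shown is a hypothetical example, not an actual SynRadar data source.

```python
# Minimal sketch of parsing equivalent JSON and XML feeds with the
# standard library (the feed schema is a hypothetical example).
import json
import xml.etree.ElementTree as ET

def parse_json_feed(raw):
    """Extract (id, severity) pairs from a JSON feed."""
    return [(item["id"], item["severity"]) for item in json.loads(raw)["items"]]

def parse_xml_feed(raw):
    """Extract (id, severity) pairs from the equivalent XML feed."""
    root = ET.fromstring(raw)
    return [(item.get("id"), item.get("severity")) for item in root.findall("item")]
```

Both parsers normalize different wire formats into the same internal tuples, which is the usual first step before analysis or 3rd-party API integration.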
nymbleUP
Remote, Mumbai
3 - 5 yrs
₹6L - ₹12L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
NumPy
Keras
+3 more
ML/AI engineer with hands-on experience working with time-series data, layering of data, and adopting complex parameters. At least 3-5 years of experience working with customer data and handling ETL operations, plus experience converting machine learning models into APIs.

Responsibilities

  1. Create data funnels to feed models via web, structured, and unstructured data
  2. Maintain coding standards using SDLC, Git, AWS deployments, etc.
  3. Keep abreast of developments in the field
  4. Deploy models in production and monitor them
  5. Document processes and logic
  6. Take ownership of the solution from code to deployment and performance

 

Bengaluru (Bangalore), Ahmedabad, NCR (Delhi | Gurgaon | Noida), Mumbai
3 - 9 yrs
₹10L - ₹18L / yr
Adobe Experience Manager (AEM)
Adobe
Servlets
JSP
HTML/CSS
+18 more

The company operates in over 25 countries across six continents and is part of Publicis Media, one of four solution hubs within Publicis Groupe, which is present in over 100 countries and employs nearly 80,000 professionals.

We believe there are better ways for brands to connect with people. And we're on a mission to guide brands to better connections across consumers, channels and partners. These are just some of the services we offer our clients in our quest to deliver ambitious outcomes.

 

Skills Required:

  • Servlet and JSP development
  • CSS, JavaScript, HTML
  • AJAX, jQuery, EXTJS
  • OSGi/FELIX
  • Web services creation and consumption
  • CMS development experience
  • Java Content Repository (JCR)/CRX
  • Eclipse IDE
  • Maven
  • SVN
  • Jenkins
  • Artifactory
  • Apache Sling
  • Lucene
  • Tomcat/JBoss
  • Apache Web Server
Srijan Technologies
Posted by PriyaSaini
Remote only
3 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data modeling
Python
PySpark
ETL
+3 more

Role Description:

  • You will be part of the data delivery team and will have the opportunity to develop a deep understanding of the domain/function.
  • You will design and drive the work plan for the optimization/automation and standardization of the processes incorporating best practices to achieve efficiency gains.
  • You will run data engineering pipelines, link raw client data with data model, conduct data assessment, perform data quality checks, and transform data using ETL tools.
  • You will perform data transformations, modeling, and validation activities, as well as configure applications to the client context. You will also develop scripts to validate, transform, and load raw data using programming languages such as Python and / or PySpark.
  • In this role, you will determine database structural requirements by analyzing client operations, applications, and programming.
  • You will develop cross-site relationships to enhance idea generation, and manage stakeholders.
  • Lastly, you will collaborate with the team to support ongoing business processes by delivering high-quality end products on time and performing quality checks wherever required.

Job Requirement:

  • Bachelor’s degree in Engineering or Computer Science; Master’s degree is a plus
  • 3+ years of professional work experience with a reputed analytics firm
  • Expertise in handling large amounts of data through Python or PySpark
  • Conduct data assessment, perform data quality checks and transform data using SQL and ETL tools
  • Experience of deploying ETL / data pipelines and workflows in cloud technologies and architecture such as Azure and Amazon Web Services will be valued
  • Comfort with data modelling principles (e.g. database structure, entity relationships, UID etc.) and software development principles (e.g. modularization, testing, refactoring, etc.)
  • A thoughtful and comfortable communicator (verbal and written) with the ability to facilitate discussions and conduct training
  • Strong problem-solving, requirement-gathering, and leadership skills.
  • Track record of completing projects successfully on time, within budget and as per scope
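The data-quality-check and transformation steps described above can be sketched in plain Python; the column names and validation rules below are illustrative assumptions, and a real pipeline would typically express them in SQL, an ETL tool, or PySpark.

```python
# Minimal validate/transform sketch of an ETL step in plain Python.
# Column names and rules are illustrative assumptions.
def run_quality_checks(rows):
    """Keep only rows with a non-empty id and a numeric amount."""
    return [r for r in rows
            if r.get("id") and isinstance(r.get("amount"), (int, float))]

def transform(rows):
    """Normalize ids and convert amounts to integer cents."""
    return [{"id": r["id"].strip().upper(),
             "amount_cents": int(r["amount"] * 100)} for r in rows]

def pipeline(raw_rows):
    # Quality checks first, then transformation, mirroring the
    # assess -> check -> transform flow described in the role.
    return transform(run_quality_checks(raw_rows))
```

The same assess/check/transform structure scales up directly: in PySpark the list comprehensions become `filter` and `withColumn` operations over a DataFrame.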

Remote, Bengaluru (Bangalore), Mumbai, Pune, Hyderabad
12 - 30 yrs
₹5L - ₹15L / yr
Java
NodeJS (Node.js)
Python
React.js
Job Description – Solutions Architect
A Solutions Architect is responsible for validating the logical models, ensuring standards, driving consolidation of redundant data, and enforcing the strategic vision through data models. The Architect has an in-depth understanding of both our business capabilities and how they align to our enterprise data models. The role partners with Enterprise Architecture to consult on and develop domain models, consults with project teams and functional units on the design of important projects or services, consults with business leadership on the design of systems and projects, and may consult with leadership on emerging technologies.
To be successful as a solution architect, you should be able to integrate any updated specifications and requirements into the systems architecture. An outstanding solution architect should be able to explain complex problems to management in layman’s terms.

Responsibilities:
Building and integrating information systems to meet the company’s needs.
Assessing the systems architecture currently in place and working with technical staff to recommend solutions to improve it.
E2E accountability of solution design across multiple products, integrations and technologies that deliver successful business outcomes which meet reliability, availability, serviceability needs.
Experience working with the latest emerging technologies and programming languages such as Java, .NET, MERN stack, MEAN stack, Angular, React, VueJS, NodeJS, Blockchain, GoLang, ML, Data Science related areas, etc.
Provide detailed specifications for proposed solutions
Resolving technical problems as they arise.
Providing supervision and guidance to development teams.
Continually researching current and emerging technologies and proposing changes where needed.
Informing various stakeholders about any problems with the current technical solutions being implemented.
Assessing the business impact that certain technical choices have.
Providing updates to stakeholders on product development processes, costs, and budgets.
Work closely with Information Technology professionals within the company to ensure hardware is available for projects and working properly
Propose and establish framework for necessary contributions from various departments
Account for possible project challenges and constraints, including risks, time, resources, and scope
Dataweave Pvt Ltd
Posted by Megha M
Bengaluru (Bangalore)
0 - 1 yrs
Best in industry
Data engineering
Internship
Python
Looking for candidates with good coding, scraping, and problem-solving skills.
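The scraping skill mentioned above can be sketched with the standard library's html.parser alone; no third-party scraping framework is assumed.

```python
# Minimal scraping sketch: collect every anchor href from an HTML page
# using only the standard library's html.parser.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

Production scrapers usually layer fetching, politeness (robots.txt, rate limits), and a library such as BeautifulSoup or Scrapy on top of this same parse-and-extract core.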
Fint Solutions
Posted by Thallada Naveen
Hyderabad
7 - 10 yrs
₹10L - ₹20L / yr
Data engineering
Data engineer
SSIS
Azure data factory
Data migration
+2 more
Job Locations: Hyderabad, Telangana, India
Required Experience: 5 - 7 Years
Skills : ADF, Azure, SSIS, python
Job Description
Azure Data Engineer with hands-on SSIS migration and ADF expertise.

Roles & Responsibilities
• Overall, 6+ years of experience in Cloud Data Engineering, with hands-on experience in ADF (Azure Data Factory), is required.
• Hands-on experience with SSIS-to-ADF migration is preferred.
• Experience migrating SQL Server Integration Services (SSIS) workloads to SSIS in ADF (must have done at least one migration).
• Hands-on experience implementing Azure Data Factory frameworks, scheduling, and performance tuning.
• Hands-on experience migrating SSIS solutions to ADF.
• Hands-on coding experience in ADF.
• Hands-on experience with MPP database architecture.
• Hands-on experience in Python.
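The ADF work described above is ultimately expressed as pipeline definitions. The fragment below is a minimal, illustrative Copy-activity pipeline; the pipeline, activity, and dataset names are hypothetical, and the real schema carries many more options (mappings, policies, triggers).

```json
{
  "name": "CopySalesData",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

SSIS-to-ADF migrations typically replace each SSIS data-flow task with a Copy or Data Flow activity of this shape, or lift the package wholesale into the Execute SSIS Package activity on an Azure-SSIS integration runtime.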
Aptus Data Labs
Posted by Merlin Metilda
Bengaluru (Bangalore)
5 - 10 yrs
₹6L - ₹15L / yr
Data engineering
Big Data
Hadoop
Data Engineer
Apache Kafka
+5 more

Roles & Responsibilities

  1. Proven experience deploying and tuning open-source components into enterprise-ready production tooling
  2. Experience with datacentre (Metal as a Service – MAAS) and cloud deployment technologies (AWS or GCP Architect certificates required)
  3. Deep understanding of Linux, from kernel mechanisms through user-space management
  4. Experience with CI/CD (Continuous Integration and Deployment) system solutions (Jenkins)
  5. Experience using monitoring tools (local and on public cloud platforms) such as Nagios, Prometheus, Sensu, ELK, CloudWatch, Splunk, New Relic, etc., to trigger instant alerts, reports and dashboards
  6. Ability to work closely with the development and infrastructure teams to analyze and design solutions with four-nines (99.99%) uptime on globally distributed, clustered, production and non-production virtualized infrastructure
  7. Wide understanding of IP networking as well as datacentre infrastructure

Skills

  1. Expert with software development tools and source-code management: understanding and managing issues and code changes, and grouping them into deployment releases in a stable and measurable way to maximize production stability
  2. Must be expert at developing and using Ansible roles and configuring deployment templates with Jinja2
  3. Solid understanding of data collection tools like Flume, Filebeat, Metricbeat and the JMX Exporter agent
  4. Extensive experience operating and tuning the Kafka streaming data platform, specifically as a message queue for big data processing
  5. Strong understanding of, and hands-on experience with:
     - the Apache Spark framework, specifically Spark Core and Spark Streaming
     - orchestration platforms: Mesos and Kubernetes
     - data storage platforms: Elastic Stack, Carbon, ClickHouse, Cassandra, Ceph, HDFS
     - core presentation technologies: Kibana and Grafana
  6. Excellent scripting and programming skills (Bash, Python, Java, Go, Rust); must have previous experience with Rust in order to support and improve in-house developed products

Certification

Red Hat Certified Architect certificate or equivalent required. CCNA certificate required. 3-5 years of experience running open-source big data platforms.

Opscruise
Posted by Sharmila M
Remote, Chennai
9 - 25 yrs
₹8L - ₹25L / yr
Data Science
Python
Machine Learning (ML)
DA
Unsupervised learning
+1 more

Responsibilities

  • Research and test novel machine learning approaches for analysing large-scale distributed computing applications.
  • Develop production-ready implementations of proposed solutions across different AI and ML models and algorithms, including testing on live customer data to improve accuracy, efficacy, and robustness
  • Work closely with other functional teams to integrate implemented systems into the SaaS platform
  • Suggest innovative and creative concepts and ideas that would improve the overall platform

Qualifications

The ideal candidate must have the following qualifications:

  • 5+ years of experience in practical implementation and deployment of large customer-facing ML-based systems.
  • MS or MTech (preferred) in applied mathematics/statistics; CS or Engineering disciplines are acceptable but must come with strong quantitative and applied mathematical skills.
  • In-depth working familiarity, beyond coursework, with classical and current ML techniques, both supervised and unsupervised learning techniques and algorithms.
  • Implementation experience and deep knowledge of classification, time series analysis, pattern recognition, reinforcement learning, deep learning, dynamic programming and optimization.
  • Experience in modeling graph structures related to spatiotemporal systems.
  • Programming skills in Python are a must.
  • Experience in developing and deploying on cloud (AWS, Google or Azure).
  • Good verbal and written communication skills.
  • Familiarity with well-known ML frameworks such as Pandas, Keras, TensorFlow.
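One of the unsupervised techniques implied above can be sketched in pure Python: one-dimensional k-means (Lloyd's algorithm). The data and starting centers below are illustrative; real work would use scikit-learn or similar on multi-dimensional data.

```python
# Minimal sketch of 1-D k-means (Lloyd's algorithm) in pure Python.
# Data and initial centers are illustrative examples.
def kmeans_1d(points, centers, iterations=10):
    """Assign each point to its nearest center, then move each center
    to the mean of its cluster; repeat for a fixed number of rounds."""
    for _ in range(iterations):
        clusters = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        # Keep a center in place if its cluster came up empty.
        centers = [sum(v) / len(v) if v else c for c, v in clusters.items()]
    return sorted(centers)
```

On well-separated data the centers converge in a couple of rounds; production use would add a convergence test and multiple random restarts.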

 

Most importantly, you should be someone who is passionate about building new and innovative products that solve tough real-world problems.

Location

Chennai, India
