Python Developer
Posted by Bavithra Kanniyappan
5 - 12 yrs
₹10L - ₹15L / yr
Remote only
Skills
Python
Amazon Web Services (AWS)
PySpark
Scala
Data engineering
Big Data
Hadoop
Spark

Hiring - Python Developer Freelance Consultant (WFH-Remote)

Greetings from Deltacubes Technology!!

 

Skillset Required:

Python

PySpark

AWS

Scala

 

Experience:

5+ years

 

Thanks

Bavithra

 


About Deltacubes

Founded: 2017
Stage: Bootstrapped
About: We are a professionally managed Corporate Training & Recruitment consulting company focusing on the growing needs of recruiting IT professionals.
Connect with the team
Anju K
Bavithra Kanniyappan
Ravi T
DeenaS S
navneet kaur
Monalisa Sharma

Similar jobs

Bengaluru (Bangalore)
2 - 8 yrs
₹4L - ₹10L / yr
Data governance
Data security
Data Analytics
Informatica
SQL
+4 more

Job Description

We are looking for a senior resource with analyst skills and knowledge of IT projects to support the delivery of risk mitigation activities and automation in Aviva's Global Finance Data Office. The successful candidate will bring structure to this new role in a developing team and will need excellent communication, organisational, and analytical skills. The candidate will primarily support data governance project/change activities and should be comfortable with ambiguity in a fast-paced, ever-changing environment. Preferred skills include knowledge of Data Governance, Informatica Axon, SQL, and AWS. In our team, success is measured by results, and we encourage flexible working where possible.

Key Responsibilities

  • Engage with stakeholders to drive delivery of the Finance Data Strategy.
  • Support data governance project/change activities in Aviva's Finance function.
  • Identify opportunities for, and implement, automations that enhance the team's performance.

Required profile

  • Relevant work experience in at least one of the following: business/project analysis, project/change management, and data analytics.
  • Proven track record of successfully communicating analytical outcomes, including an ability to communicate effectively with both business and technical teams.
  • Ability to manage multiple competing priorities and hold the team and stakeholders to account on progress.
  • Ability to contribute to, plan, and execute an end-to-end data governance framework.
  • Basic knowledge of IT systems/projects and the development lifecycle.
  • Experience gathering business requirements and building reports.
  • Advanced experience of MS Excel data processing (VBA macros).
  • Good communication skills.

 

Additional Information

Degree in a quantitative or scientific field (e.g. Engineering, MBA Finance, Project Management) and/or experience in data governance/quality/privacy.
Knowledge of Finance systems/processes.
Experience in analysing large data sets using dedicated analytics tools.

 

Designation – Assistant Manager TS

Location – Bangalore

Shift – 11 AM – 8 PM
AxionConnect Infosolutions Pvt Ltd
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snowflake schema
SQL
+4 more

Job Location: Hyderabad/Bangalore/Chennai/Pune/Nagpur

Notice period: Immediate - 15 days

 

Python Developer with Snowflake

 

Job Description:


  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL and the ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file (a minimal connection sketch follows this list).
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loading using Python.
  7. Experience creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning is an added advantage.
  10. Good understanding of data warehouse (DWH) concepts.
  11. Ability to interpret/analyze business requirements and functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
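To ground item 3, here is a minimal sketch of connecting to Snowflake from Python and loading a file with the snowflake-connector-python package; the account, credentials, file path, and table names are placeholders, not details from this listing.

import snowflake.connector

# Placeholder credentials and object names; replace with real values.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Stage a local CSV on the table's stage, then load and query it.
    cur.execute("PUT file:///tmp/sales.csv @%SALES_RAW")
    cur.execute("COPY INTO SALES_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    cur.execute("SELECT region, SUM(amount) FROM SALES_RAW GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    cur.close()
    conn.close()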
Ascendeum
Posted by Sonali Jain
Remote only
1 - 3 yrs
₹6L - ₹9L / yr
Python
CI/CD
Storage & Networking
Data storage
  • Understand long-term and short-term business requirements and match them precisely with the capabilities of the many distributed storage and computing technologies available in the ecosystem.

  • Create complex data processing pipelines.

  • Design scalable implementations of the models developed by our Data Scientists.

  • Deploy data pipelines in production systems based on CI/CD practices.

  • Create and maintain clear documentation of data models/schemas as well as transformation/validation rules.

  • Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers.

JK Technosoft Ltd
Posted by Nishu Gupta
Bengaluru (Bangalore)
3 - 5 yrs
₹5L - ₹15L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+13 more

Roles and Responsibilities:

  • Design, develop, and maintain the end-to-end MLOps infrastructure from the ground up, leveraging open-source systems across the entire MLOps landscape.
  • Create pipelines for data ingestion, data transformation, building, testing, and deploying machine learning models, as well as monitoring and maintaining the performance of these models in production.
  • Manage the MLOps stack, including version control systems, continuous integration and deployment tools, containerization, orchestration, and monitoring systems.
  • Ensure that the MLOps stack is scalable, reliable, and secure.

Skills Required:

  • 3-6 years of MLOps experience
  • Preferably worked in the startup ecosystem

Primary Skills:

  • Experience with E2E MLOps systems like ClearML, Kubeflow, MLflow, etc. (see the MLflow sketch below).
  • Technical expertise in MLOps: Should have a deep understanding of the MLOps landscape and be able to leverage open-source systems to build scalable, reliable, and secure MLOps infrastructure.
  • Programming skills: Proficient in at least one programming language, such as Python, and have experience with data science libraries, such as TensorFlow, PyTorch, or Scikit-learn.
  • DevOps experience: Should have experience with DevOps tools and practices, such as Git, Docker, Kubernetes, and Jenkins.

Secondary Skills:

  • Version Control Systems (VCS) tools like Git and Subversion
  • Containerization technologies like Docker and Kubernetes
  • Cloud Platforms like AWS, Azure, and Google Cloud Platform
  • Data Preparation and Management tools like Apache Spark, Apache Hadoop, and SQL databases like PostgreSQL and MySQL
  • Machine Learning Frameworks like TensorFlow, PyTorch, and Scikit-learn
  • Monitoring and Logging tools like Prometheus, Grafana, and Elasticsearch
  • Continuous Integration and Continuous Deployment (CI/CD) tools like Jenkins, GitLab CI, and CircleCI
  • Explainability and interpretability tools like LIME and SHAP
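As a hedged illustration of the E2E MLOps tracking systems named in the primary skills, a minimal MLflow experiment-tracking sketch; the experiment name, parameters, and metric values are placeholders.

import mlflow

# Placeholder experiment; a real run would train a model between the log calls.
mlflow.set_experiment("churn-model")

with mlflow.start_run():
    mlflow.log_param("max_depth", 6)
    mlflow.log_param("n_estimators", 200)
    # ... model training would happen here ...
    mlflow.log_metric("val_auc", 0.91)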


Srijan Technologies
Posted by Srijan Technologies
Remote only
2 - 5 yrs
₹5L - ₹15L / yr
Big Data
Apache Kafka
Hadoop
Spark
Data engineering
+3 more
Job Description:
We are looking for a Data Engineer whose responsibilities include creating machine learning models and retraining systems. To do this job successfully, you need exceptional skills in statistics and programming. If you also have knowledge of data science and software engineering, your ultimate goal will be to shape and build efficient self-learning applications.


Technical Knowledge (Must Have)

  • Strong experience in SQL/HiveQL/AWS Athena.
  • Strong expertise in the development of data pipelines (SnapLogic preferred).
  • Design, development, deployment, and administration of data processing applications.
  • Good exposure to AWS and Azure cloud computing environments.
  • Knowledge of Big Data, AWS cloud architecture, best practices, security, governance, metadata management, data quality, etc.
  • Data extraction from various firm sources (RDBMS, unstructured data sources) and loading to a data lake following best practices.
  • Knowledge of Python.
  • Good knowledge of NoSQL technologies (Neo4j/MongoDB).
  • Experience/knowledge of SnapLogic (ETL technologies).
  • Working knowledge of Unix (AIX, Linux) and shell scripting.
  • Experience/knowledge of data modeling and database development.
  • Experience/knowledge creating reports and dashboards in Tableau/Power BI.
Virtusa
Agency job
via Response Informatics by Anupama Lavanya Uppala
Chennai, Bengaluru (Bangalore), Mumbai, Hyderabad, Pune
3 - 10 yrs
₹10L - ₹25L / yr
PySpark
Python
  • Minimum 1 year of relevant experience in PySpark (mandatory).
  • Hands-on experience developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is an added plus (a minimal PySpark sketch follows this list).
  • Ability to play a lead role and independently manage a 3-5 member PySpark development team.
  • EMR, Python, and PySpark are mandatory.
  • Knowledge of and awareness working with AWS cloud technologies like Apache Spark, Glue, Kafka, Kinesis, and Lambda alongside S3, Redshift, and RDS.
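For illustration only, a minimal PySpark sketch of the AWS-side pipeline work described above; the S3 paths and column names are placeholders, not details from the listing.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-rollup").getOrCreate()

# Placeholder S3 path and columns.
orders = spark.read.parquet("s3://my-bucket/raw/orders/")

# Roll raw orders up to one row per date and region.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-bucket/curated/daily_orders/"
)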
Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹5L - ₹10L / yr
Data Warehouse (DWH)
Spark
Data engineering
Python
PySpark
+5 more

Basic Qualifications

- Need to have a working knowledge of AWS Redshift.

- Minimum 1 year of designing and implementing a fully operational production-grade large-scale data solution on Snowflake Data Warehouse.

- 3 years of hands-on experience building productized data ingestion and processing pipelines using Spark, Scala, and Python.

- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions.

- Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.

- Excellent presentation and communication skills, both written and verbal.

- Ability to problem-solve and architect in an environment with unclear requirements.

Discite Analytics Private Limited
Uma Sravya B
Posted by Uma Sravya B
Ahmedabad
4 - 7 yrs
₹12L - ₹20L / yr
Hadoop
Big Data
Data engineering
Spark
Apache Beam
+13 more
Responsibilities:
1. Communicate with the clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to strive for greater functionality.

Skills required (experience with at least most of these):
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and data warehousing.
4. Experience with, and a firm understanding of, relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra, etc.
5. Experience with cloud platforms like AWS, GCP, and Azure.
6. Experience with workflow management using tools like Apache Airflow (a minimal DAG sketch follows this list).
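By way of a hedged example of item 6, a minimal Apache Airflow DAG; the DAG id, schedule, and task bodies are placeholders invented for illustration.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")  # placeholder for real extraction logic

def transform():
    print("cleaning and loading")  # placeholder for real transform logic

with DAG(
    dag_id="daily_ingest",            # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task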
Venture Highway
Posted by Nipun Gupta
Bengaluru (Bangalore)
2 - 6 yrs
₹10L - ₹30L / yr
Python
Data engineering
Data Engineer
MySQL
MongoDB
+5 more
- Experience with Python and data scraping.
- Experience with relational SQL and NoSQL databases, including MySQL and MongoDB.
- Familiarity with the basic principles of distributed computing and data modeling.
- Experience with distributed data pipeline frameworks like Celery, Apache Airflow, etc. (a minimal Celery sketch follows below).
- Experience with NLP and NER models is a bonus.
- Experience building reusable code and libraries for future use.
- Experience building REST APIs.

Preference for candidates working in tech product companies.
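As a hedged sketch of the distributed pipeline frameworks named above, a minimal Celery task for a scraping workload; the broker URL, retry policy, and task name are assumptions for illustration, not requirements from the listing.

import requests
from celery import Celery

# Placeholder broker URL; any supported broker (Redis, RabbitMQ) works.
app = Celery("scraper", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3)
def fetch_page(self, url):
    # Fetch a page, retrying on transient network errors.
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        return response.text
    except requests.RequestException as exc:
        raise self.retry(exc=exc, countdown=30)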
Ganit Business Solutions
Posted by Kavitha J
Remote, Chennai, Bengaluru (Bangalore), Mumbai
3 - 6 yrs
₹12L - ₹20L / yr
Data Science
Data Scientist
R Programming
Python
Predictive modelling
+3 more

Ganit Inc. is the fastest-growing Data Science & AI company in Chennai.

Founded in 2017 by three industry experts, alumni of IITs/SPJIMR, each with 17+ years of experience in the field of analytics.

We are in the business of maximising Decision Making Power (DMP) for companies by providing solutions at the intersection of hypothesis-based analytics, discovery-based AI, and IoT. Our solutions are a combination of customised services and a functional product suite.

We primarily operate as a US-based start-up, with clients across the US, Asia-Pacific, and the Middle East, and offices in New Jersey, USA and Chennai, India.

Started with 3 people, the company is growing fast and now has 100+ employees.

 

1. What do we expect from you

 

- Should possess a minimum of 2 years of experience in data analytics model development and deployment

- Skills relating to core Statistics & Mathematics

- Huge interest in handling numbers

- Ability to understand all domains in businesses across various sectors

- Natural passion for numbers, business, coding, and visualisation

 

2. Necessary skill set:

 

- Proficient in R/Python, Advanced Excel, and SQL

- Should have worked on Retail/FMCG/CPG projects solving analytical problems in Sales/Marketing/Supply Chain functions

- Very good understanding of algorithms, mathematical models, statistical techniques, and data mining methods such as regression models, clustering/segmentation, time series forecasting, and decision trees/random forests (a small Python sketch follows this list)

- Ability to choose the right model for the right data and translate it into code in R, Python, or VBA (proven capabilities)

- Should have handled large datasets and have a thorough understanding of SQL

- Ability to handle a team of Data Analysts
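To make the modelling expectation concrete, a minimal scikit-learn sketch that fits a random forest, one of the model families listed above; the dataset, columns, and values are invented for the example.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Invented toy dataset: weekly sales with two explanatory features.
df = pd.DataFrame({
    "promo_spend": [10, 15, 8, 20, 12, 18, 9, 14],
    "footfall": [200, 260, 150, 320, 210, 300, 160, 240],
    "sales": [1.2, 1.8, 0.9, 2.4, 1.4, 2.1, 1.0, 1.6],
})

X_train, X_test, y_train, y_test = train_test_split(
    df[["promo_spend", "footfall"]], df["sales"], test_size=0.25, random_state=42
)

model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))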

 

3. Good to have skill set:

 

- Microsoft Power BI / Tableau / QlikView / Spotfire

 

4. Job Responsibilities:

 

- Translate business requirements into technical requirements

- Data extraction, preparation, and transformation

- Identify, develop, and implement statistical techniques and algorithms that address business challenges and add value to the organisation

- Create and implement data models

- Interact with clients for queries and delivery adoption

 

5. Screening Methodology

 

- Problem-solving round (telephonic conversation)

- Technical discussion round (telephonic conversation)

- Final fitment discussion (video round)

 

 
