codersbrain
Python + Big Data Developer
Posted by Aishwarya Hire
4 - 6 yrs
₹8L - ₹10L / yr
Bengaluru (Bangalore)
Skills
PySpark
Data engineering
Big Data
Hadoop
Spark
Python
  • Design the architecture of our big data platform
  • Perform and oversee tasks such as writing scripts, calling APIs, web scraping, and writing SQL queries
  • Design and implement data stores that support the scalable processing and storage of our high-frequency data
  • Maintain our data pipeline
  • Customize and oversee integration tools, warehouses, databases, and analytical systems
  • Configure and provide availability for data-access tools used by all data scientists
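For illustration only (not part of the job description), here is a minimal PySpark sketch of the kind of pipeline work listed above: ingest raw events, apply a light transformation, and write a partitioned store that scales with high-frequency data. All paths, column names, and schema details are hypothetical.

```python
# Hypothetical sketch only: bucket paths, column names, and schema are
# placeholders, not details from the job posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

# Read raw JSON events landed by upstream scripts / API pulls / scrapers.
raw = spark.read.json("s3a://example-bucket/raw/events/")

events = (
    raw.filter(F.col("event_type").isNotNull())          # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition key
)

# Partitioning by date keeps scans cheap as high-frequency data accumulates.
events.write.mode("append").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/events/"
)
```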



About codersbrain

Founded: 2015
Size: 20-100
Stage: Bootstrapped

Connect with the team: Shreya Trivedi

Similar jobs

Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
ETL
Informatica
Data Warehouse (DWH)
PowerBI
databricks
+4 more

About The Company


The client is a 17-year-old multinational company headquartered in Whitefield, Bangalore, with another delivery center in Hinjewadi, Pune. It also has offices in the US and Germany, works with several OEMs and product companies in about 12 countries, and has a 200+ strong team worldwide.


The Role


Power BI front-end developer in the Data Domain (Manufacturing, Sales & Marketing, Purchasing, Logistics, …). Responsible for the Power BI front-end design, development, and delivery of highly visible data-driven applications in the Compressor Technique. You take a quality-first approach, ensuring data is visualized in a clear, accurate, and user-friendly manner. You ensure standards and best practices are followed and that documentation is created and maintained. Where needed, you take initiative and make recommendations to drive improvements. In this role you will also be involved in the tracking, monitoring, and performance analysis of production issues and in implementing bug fixes and enhancements.


Skills & Experience


• The ideal candidate has a degree in Computer Science, Information Technology, or equivalent through experience.

• Strong knowledge of BI development principles, time intelligence functions, dimensional modeling, and data visualization is required.

• Advanced knowledge of and 5-10 years of experience with professional BI development and data visualization is preferred.

• You are familiar with data warehouse concepts.

• Knowledge of MS Azure (Data Lake, Databricks, SQL) is considered a plus.

• Experience with scripting languages such as PowerShell and Python to set up and automate Power BI platform-related activities is an asset (see the sketch after this list).

• Good knowledge (oral and written) of English is required.
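As a purely illustrative aside on the scripting bullet above, here is a minimal Python sketch that automates a dataset refresh through the public Power BI REST API. It assumes an Azure AD access token has already been obtained; the workspace and dataset IDs are placeholders.

```python
# Hypothetical sketch: trigger and check a Power BI dataset refresh.
# The token, workspace ID, and dataset ID are placeholders.
import requests

BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <access-token>"}  # e.g. obtained via MSAL / service principal

workspace_id = "<workspace-guid>"
dataset_id = "<dataset-guid>"

# Kick off a refresh.
resp = requests.post(
    f"{BASE}/groups/{workspace_id}/datasets/{dataset_id}/refreshes",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()

# Check the most recent refresh status.
history = requests.get(
    f"{BASE}/groups/{workspace_id}/datasets/{dataset_id}/refreshes?$top=1",
    headers=headers,
    timeout=30,
).json()
print(history["value"][0]["status"])
```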

Bengaluru (Bangalore)
2 - 6 yrs
₹25L - ₹45L / yr
Data engineering
Data Analytics
Big Data
Apache Spark
airflow
+8 more
● 2+ years of experience in a Data Engineer role.
● Proficiency in Linux.
● Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
● Strong SQL knowledge and experience with relational databases and query authoring, as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
● Must have experience with Python/Scala.
● Must have experience with Big Data technologies like Apache Spark.
● Must have experience with Apache Airflow.
● Experience with data pipelines and ETL tools like AWS Glue (a brief sketch follows).
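As an illustration of the Airflow/ETL experience asked for above, a minimal sketch of a daily DAG follows; the task bodies, DAG name, and targets are placeholders, not details from this posting.

```python
# Hypothetical sketch: a daily two-step pipeline (extract to S3, then load
# into Redshift). Function bodies and names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    ...  # call the source system and land raw files in S3


def load_to_redshift(**context):
    ...  # COPY the landed files into a Redshift staging table


with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)

    extract >> load
```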
TEKsystems
Posted by Priyanka Kanwar
Gurugram
5 - 10 yrs
₹15L - ₹25L / yr
Apache Spark
Amazon Web Services (AWS)
Python
airflow
Algorithms

TOP 3 SKILLS

Python (Language)

Spark Framework

Spark Streaming

Docker / Jenkins / Spinnaker

AWS

Hive queries

The candidate should be a good coder.

Preferred: Airflow

Must-have experience:

Python

Spark framework and streaming

Exposure to the machine learning lifecycle is mandatory.

Project:

This is a search-domain project. For any search activity happening on the website, this team builds the model behind it, creating sorting/scoring models for search; these models are created by the data scientists. The team works mostly on the streaming side of the data, so the candidate would work extensively on Spark Streaming, and there will be a lot of machine learning work.
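To illustrate the streaming side of this work, here is a minimal Spark Structured Streaming sketch that consumes search events from Kafka and applies a stand-in scoring function. The topic, schema, and scoring logic are placeholders; the real ranking models come from the data science team.

```python
# Hypothetical sketch: requires the spark-sql-kafka connector on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("search-scoring-stream").getOrCreate()

schema = StructType([
    StructField("query", StringType()),
    StructField("item_id", StringType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "search-events")              # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Stand-in for the real model: score by query length only.
scored = events.withColumn("score", F.length("query"))

scored.writeStream.format("console").outputMode("append").start().awaitTermination()
```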


INTERVIEW INFORMATION

3-4 rounds.

1st round: data engineering batch-processing experience.

2nd round: data engineering streaming experience.

3rd round: ML lifecycle (this can be a techno-functional round based on earlier feedback; otherwise a 4th, functional round will be held if required).

A modern ayurvedic nutrition brand
Bengaluru (Bangalore)
1.5 - 3 yrs
₹5L - ₹5L / yr
Data Analytics
Data Analyst
MS-Excel
SQL
Python
+1 more
About the role:
We are looking for a motivated data analyst with sound experience in web/digital analytics to join the Kapiva D2C Business Team. This team is primarily responsible for driving sales and customer engagement on our website. This channel has grown 5x in revenue over the last 12 months and is poised to grow another 5x over the next six. It represents a high-growth, important part of our overall e-commerce growth strategy.
The mandate here is to run an end-to-end sustainable e-commerce business, boost sales through marketing campaigns, and build a cutting-edge product (website) that optimizes the customer's journey and increases customer lifetime value.
The Data Analyst will support the business heads by providing data-backed insights to drive customer growth, retention, and engagement. They will be required to set up and manage reports, test various hypotheses, and coordinate with various stakeholders on a day-to-day basis.


Job Responsibilities:
Strategy and planning:
● Work with the D2C functional leads and support analytics planning on a quarterly/ annual basis
● Identify reports and analytics needed to be conducted on a daily/ weekly/ monthly frequency
● Drive planning for hypothesis-led testing of key metrics across the customer funnel
Analytics:
● Interpret data, analyze results using statistical techniques and provide ongoing reports
● Analyze large amounts of information to discover trends and patterns
● Work with business teams to prioritize business and information needs
● Collaborate with engineering and product development teams to set up data infrastructure as needed

Reporting and communication:
● Prepare reports / presentations to present actionable insights that can drive business objectives
● Set up live dashboards reporting key cross-functional metrics
● Coordinate with various stakeholders to collect useful and required data
● Present findings to business stakeholders to drive action across the organization
● Propose solutions and strategies to business challenges
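For illustration only, a minimal Python (pandas) sketch of the day-to-day reporting described above: a conversion funnel by traffic source computed from exported event data. The file and column names are placeholders.

```python
# Hypothetical sketch: compute a visit-to-purchase funnel by traffic source.
import pandas as pd

# Placeholder export with columns: user_id, source, stage ("visit", "purchase", ...)
events = pd.read_csv("web_events.csv")

funnel = events.pivot_table(
    index="source", columns="stage", values="user_id",
    aggfunc="nunique", fill_value=0,
)
funnel["visit_to_purchase"] = funnel["purchase"] / funnel["visit"]
print(funnel.sort_values("visit_to_purchase", ascending=False))
```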

Requirements sought:
Must haves:
● Bachelor's/Master's degree in Mathematics, Economics, Computer Science, Information Management, Statistics, or a related field
● High proficiency in MS Excel and SQL
● Knowledge of one or more programming languages like Python/ R. Adept at queries, report writing and presenting findings
● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy - working knowledge of statistics and statistical methods
● Ability to work in a highly dynamic environment across cross-functional teams; good at coordinating with different departments and managing timelines
● Exceptional English written/verbal communication
● A penchant for understanding consumer traits and behavior and a keen eye to detail

Good to have:
● Hands-on experience with one or more web analytics tools like Google Analytics, Mixpanel, Kissmetrics, Heap, Adobe Analytics, etc.
● Experience in using business intelligence tools like Metabase, Tableau, Power BI is a plus
● Experience in developing predictive models and machine learning algorithms
Air India
Agency job
via Seven N Half by Susmitha Goddindla
Kochi (Cochin)
2 - 13 yrs
₹3L - ₹15L / yr
ADF
Oracle ADF
data pipelines
Python

ADF Developer with a top conglomerate (Air India) for the Kochi location

Conducting face-to-face (F2F) interviews on 22nd April 2023.

Experience - 2-12 years.

Location - Kochi only (work from the office only)


Notice period - 1 month only.


If you are interested, please share the following information at your earliest convenience.

RandomTrees
Posted by Amareswarreddt Yaddula
Hyderabad
5 - 16 yrs
₹1L - ₹30L / yr
ETL
Informatica
Data Warehouse (DWH)
Amazon Web Services (AWS)
SQL
+3 more

We are hiring an AWS Data Engineer expert to join our team.


Job Title: AWS Data Engineer

Experience: 5 to 10 years

Location: Remote

Notice: Immediate or Max 20 Days

Role: Permanent Role


Skillset: AWS, ETL, SQL, Python, PySpark, Postgres, Dremio.


Job Description:

Able to develop ETL jobs.

Able to help with data curation/cleanup, data transformation, and building ETL pipelines.

Strong Postgres experience; knowledge of Dremio as a data visualization/semantic layer between the database and the application is a plus.

SQL, Python, and PySpark are a must.

Good communication skills are required.
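Purely as an illustration of the skill set above, a minimal PySpark ETL sketch: read a Postgres table over JDBC, clean it, and write curated Parquet for a downstream semantic layer such as Dremio. Connection details and table names are placeholders.

```python
# Hypothetical sketch: requires the PostgreSQL JDBC driver on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://db-host:5432/sales")  # placeholder
         .option("dbtable", "public.orders")                     # placeholder
         .option("user", "etl_user")
         .option("password", "***")
         .load()
)

curated = (
    orders.dropDuplicates(["order_id"])                  # basic data cleanup
          .filter(F.col("order_total") >= 0)
          .withColumn("order_date", F.to_date("created_at"))
)

curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)
```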





Bungee Tech India
Abigail David
Posted by Abigail David
Remote, NCR (Delhi | Gurgaon | Noida), Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Big Data
Hadoop
Apache Hive
Spark
ETL
+3 more

Company Description

At Bungee Tech, we help retailers and brands meet customers everywhere, on every occasion they are in. We believe that accurate, high-quality data matched with compelling market insights empowers retailers and brands to keep their customers at the center of all the innovation and value they are delivering.

 

We provide a clear and complete omnichannel picture of their competitive landscape to retailers and brands. We collect billions of data points every day and multiple times in a day from publicly available sources. Using high-quality extraction, we uncover detailed information on products or services, which we automatically match, and then proactively track for price, promotion, and availability. Plus, anything we do not match helps to identify a new assortment opportunity.

 

Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.

We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions for these purposes, and then maintaining, implementing, and monitoring them.

You will also be responsible for integrating them with the architecture used in the company.

 

We're working on the future. If you are seeking an environment where you can drive innovation, if you want to apply state-of-the-art software technologies to solve real-world problems, and if you want the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.

 

Responsibilities

As an experienced member of the team, in this role, you will:

 

  • Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development

 

  • You will research, design and code, troubleshoot and support. What you create is also what you own.

 

  • Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces.

 

  • Be able to broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation.
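As an illustrative aside on the data-quality monitoring mentioned above, here is a minimal PySpark sketch of the kind of check such tooling would automate; the path and column names are placeholders.

```python
# Hypothetical sketch: basic row-count, null-rate, and duplicate-key checks.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3a://example-bucket/matched_products/")  # placeholder path

total = df.count()
metrics = {
    "rows": total,
    "null_price_rate": df.filter(F.col("price").isNull()).count() / total,
    "duplicate_product_ids": total - df.select("product_id").distinct().count(),
}
print(metrics)  # in practice these would feed a monitoring dashboard or alert
```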

 

BASIC QUALIFICATIONS

  • Bachelor’s degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar.
  • 5+ years of relevant professional experience in Data Engineering and Business Intelligence
  • 5+ years with advanced SQL (analytical functions), ETL, and Data Warehousing.
  • Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ ELT and reporting/analytic tools and environments, data structures, data modeling and performance tuning.
  • Ability to effectively communicate with both business and technical teams.
  • Excellent coding skills in Java, Python, C++, or equivalent object-oriented programming language
  • Understanding of relational and non-relational databases and basic SQL
  • Proficiency with at least one of these scripting languages: Perl / Python / Ruby / shell script

 

PREFERRED QUALIFICATIONS

 

  • Experience with building data pipelines from application databases.
  • Experience with AWS services - S3, Redshift, Spectrum, EMR, Glue, Athena, ELK etc.
  • Experience working with Data Lakes.
  • Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
  • Sharp problem-solving skills and the ability to resolve ambiguous requirements
  • Experience working with Big Data
  • Knowledge of and experience working with Hive and the Hadoop ecosystem
  • Knowledge of Spark
  • Experience working with Data Science teams
DataMetica
Posted by Nikita Aher
Pune
2.5 - 6 yrs
₹1L - ₹8L / yr
Big Data
Hadoop
Apache Hive
Spark
Data engineering
+3 more
Job Title/Designation: Big Data Engineers - Hadoop, Pig, Hive, Spark
Employment Type: Full Time, Permanent

Job Description:
 
Work Location - Pune
Work Experience - 2.5 to 6 Years
 
Note - Candidates with short notice periods will be given preference.
 
Mandatory Skills:
  • Working knowledge and hands-on experience of Big Data / Hadoop tools and technologies.
  • Experience working with Pig, Hive, Flume, Sqoop, Kafka, etc.
  • Database development experience with a solid understanding of core database concepts, relational database design, ODS, and DWH.
  • Expert-level knowledge of SQL and scripting, preferably UNIX shell scripting and Perl.
  • Working knowledge of data integration solutions and familiarity with an ETL tool (Informatica, DataStage, Ab Initio, Pentaho, etc.).
  • Strong problem-solving and logical-reasoning ability.
  • Excellent understanding of all aspects of the Software Development Lifecycle.
  • Excellent written and verbal communication skills.
  • Experience in Java will be an added advantage.
  • Knowledge of object-oriented programming concepts.
  • Exposure to ISMS policies and procedures.
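For illustration only, a minimal PySpark sketch touching the Hive/Spark skills above: query a Hive table through Spark SQL. The database and table names are placeholders.

```python
# Hypothetical sketch: Spark session with Hive support and a simple aggregate query.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-report")
    .enableHiveSupport()
    .getOrCreate()
)

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS orders
    FROM sales_ods.orders        -- placeholder database.table
    GROUP BY order_date
    ORDER BY order_date
""")
daily.show()
```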
BDI Plus Lab
Posted by Silita S
Bengaluru (Bangalore)
3 - 7 yrs
₹5L - ₹12L / yr
Big Data
Hadoop
Java
Python
PySpark
+1 more

Roles and responsibilities:

 

  1. Responsible for the development and maintenance of applications built with Enterprise Java and distributed technologies.
  2. Experience with Hadoop, Kafka, Spark, Elasticsearch, SQL, Kibana, and Python, plus experience with machine learning and analytics.
  3. Collaborate with developers, product managers, business analysts, and business users in conceptualizing, estimating, and developing new software applications and enhancements.
  4. Collaborate with the QA team to define test cases and metrics and to resolve questions about test results.
  5. Assist in the design and implementation process for new products; research and create POCs for possible solutions.
  6. Develop components based on business and/or application requirements.
  7. Create unit tests in accordance with team policies and procedures.
  8. Advise and mentor team members in specialized technical areas, as well as fulfill administrative duties as defined by the support process.
  9. Work with cross-functional teams during a crisis to address and resolve complex incidents and problems, in addition to the assessment, analysis, and resolution of cross-functional issues.
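As a purely illustrative sketch for the Kafka/Elasticsearch stack mentioned above: consume JSON events from a Kafka topic and index them into Elasticsearch. Topic, hosts, and index names are placeholders; it assumes the kafka-python and elasticsearch client libraries.

```python
# Hypothetical sketch: stream JSON events from Kafka into an Elasticsearch index.
import json

from kafka import KafkaConsumer          # kafka-python
from elasticsearch import Elasticsearch  # elasticsearch client

consumer = KafkaConsumer(
    "app-events",                        # placeholder topic
    bootstrap_servers=["broker:9092"],   # placeholder broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
es = Elasticsearch("http://localhost:9200")  # placeholder host

for message in consumer:
    es.index(index="app-events", document=message.value)
```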
Bengaluru (Bangalore)
4 - 10 yrs
₹9L - ₹15L / yr
R Programming
Data Analytics
Predictive modelling
Supply Chain Management (SCM)
SQL
+4 more

The person holding this position is responsible for leading the solution development and implementing advanced analytical approaches across a variety of industries in the supply chain domain.

In this position, you act as an interface between the delivery team and the supply chain team, effectively understanding the client's business and supply chain.

Candidates will be expected to lead projects across several areas such as

  • Demand forecasting
  • Inventory management
  • Simulation & Mathematical optimization models.
  • Procurement analytics
  • Distribution/Logistics planning
  • Network planning and optimization
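As an illustration of the demand-forecasting work listed above, a minimal Python sketch that fits a Holt-Winters model to monthly demand and forecasts the next quarter. The file layout and column names are placeholders, and at least two full seasonal cycles of history are assumed.

```python
# Hypothetical sketch: seasonal exponential smoothing on monthly demand.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

demand = (
    pd.read_csv("monthly_demand.csv", parse_dates=["month"])  # placeholder file
      .set_index("month")["units"]
      .asfreq("MS")  # monthly series indexed at month start
)

model = ExponentialSmoothing(
    demand, trend="add", seasonal="add", seasonal_periods=12
).fit()

print(model.forecast(3))  # expected demand for the next three months
```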

 

Qualification and Experience

  • 4+ years of analytics experience in supply chain, preferably in industries such as hi-tech, consumer technology, CPG, automobile, retail, or e-commerce.
  • Master's in Statistics/Economics, an MBA, or an M.Sc./M.Tech in Operations Research/Industrial Engineering/Supply Chain
  • Hands-on experience in delivery of projects using statistical modelling

Skills / Knowledge

  • Hands-on experience with statistical modelling software such as R/Python, and with SQL.
  • Experience with advanced analytics/statistical techniques (regression, decision trees, ensemble machine learning algorithms, etc.) will be considered an added advantage.
  • Highly proficient with Excel, PowerPoint, and Word.
  • APICS-CSCP or PMP certification will be an added advantage.
  • Strong knowledge of supply chain management.
  • Working knowledge of linear/nonlinear optimization.
  • Ability to structure problems through a data-driven decision-making process.
  • Excellent project management skills, including time and risk management and project structuring.
  • Ability to identify and draw on leading-edge analytical tools and techniques to develop creative approaches and new insights into business issues through data analysis.
  • Ability to liaise effectively with multiple stakeholders and functional disciplines.
  • Experience with optimization tools like CPLEX, ILOG, or GAMS will be an added advantage.