Data Engineer
British Telecom

Agency job
3 - 7 yrs
₹8L - ₹14L / yr
Bengaluru (Bangalore)
Skills
Data engineering
Big Data
Google Cloud Platform (GCP)
ETL
Data warehousing
Amazon Web Services (AWS)
Windows Azure
Python
Java
DevOps
JIRA
You'll have the following skills & experience:

• Problem Solving: resolving production issues to fix P1-P4 service incidents, problems arising from introducing new technology, and major issues in the platform and/or service.
• Software Development Concepts: understands and is experienced with a wide range of programming concepts, and is aware of and has applied a range of algorithms.
• Commercial & Risk Awareness: able to understand and evaluate both obvious and subtle commercial risks, especially in relation to a programme.
Experience you would be expected to have
• Cloud: experience with one of the following cloud vendors: AWS, Azure, or GCP
• GCP: experience preferred, but willingness to learn is essential
• Big Data: experience with Big Data methodology and technologies
• Programming: Python or Java, with experience working with data (ETL)
• DevOps: understanding of how to work in a DevOps and agile way (versioning, automation, defect management) — mandatory
• Agile methodology: knowledge of Jira
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About British Telecom

About
This blog addresses the health of the British Telecom industry. Includes news, reports, and opinion pieces.

Similar jobs

Sadup Softech
1 recruiter
Posted by madhuri g
Remote only
4 - 6 yrs
₹4L - ₹15L / yr
Google Cloud Platform (GCP)
BigQuery
PySpark
Data engineering
Big Data
+2 more

Job Description:

We are seeking a talented Machine Learning Engineer with expertise in software engineering to join our team. As a Machine Learning Engineer, your primary responsibility will be to develop machine learning (ML) solutions that focus on technology process improvements. Specifically, you will work on projects involving ML and Generative AI solutions for technology and data-management efficiencies, such as optimal cloud computing, knowledge bots, software code assistants, and automatic data management.

 

Responsibilities:

- Collaborate with cross-functional teams to identify opportunities for technology process improvements that can be solved using machine learning and generative AI.

- Define and build innovative ML and Generative AI systems, such as AI assistants for varied SDLC tasks, and improve data and infrastructure management.

- Design and develop ML engineering solutions and generative AI applications, and fine-tune large language models (LLMs) for the above, ensuring scalability, efficiency, and maintainability of such solutions.

- Implement prompt engineering techniques to fine-tune and enhance LLMs for better performance and application-specific needs.

- Stay abreast of the latest advancements in the field of Generative AI and actively contribute to the research and development of new ML & Generative AI Solutions.

 

Requirements:

- A Master's or Ph.D. degree in Computer Science, Statistics, Data Science, or a related field.

- Proven experience working as a Software Engineer, with a focus on ML engineering and exposure to Generative AI applications such as ChatGPT.

- Strong proficiency in programming languages such as Java, Scala, and Python, and in platforms such as Google Cloud, BigQuery, Hadoop, and Spark.

- Solid knowledge of software engineering best practices, including version control systems (e.g., Git), code reviews, and testing methodologies.

- Familiarity with large language models (LLMs), prompt engineering techniques, vector DB's, embedding & various fine-tuning techniques.

- Strong communication skills to effectively collaborate and present findings to both technical and non-technical stakeholders.

- Proven ability to adapt and learn new technologies and frameworks quickly.

- A proactive mindset with a passion for continuous learning and research in the field of Generative AI.

 

If you are a skilled and innovative Machine Learning Engineer with a passion for Generative AI and a desire to contribute to technology process improvements, we would love to hear from you. Join our team and help shape the future of our AI-driven technology solutions.
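The prompt engineering mentioned above often starts with something very simple: assembling an instruction, a few worked examples, and the new query into one prompt string for an LLM. A minimal sketch follows; the task, labels, and formatting are invented for illustration and not part of any specific product described here.

```python
# Minimal few-shot prompt construction for a completion-style LLM.
# The classification task and example data are illustrative assumptions.

def build_few_shot_prompt(instruction, examples, query):
    """Assemble an instruction, worked examples, and the new query
    into a single prompt string."""
    parts = [instruction.strip(), ""]
    for inp, out in examples:
        parts.append(f"Input: {inp}")
        parts.append(f"Output: {out}")
        parts.append("")
    parts.append(f"Input: {query}")
    parts.append("Output:")
    return "\n".join(parts)

prompt = build_few_shot_prompt(
    "Classify the log line as ERROR or OK.",
    [("disk full on /dev/sda1", "ERROR"), ("backup completed", "OK")],
    "connection timed out after 30s",
)
print(prompt)
```

In practice the same string-assembly idea underlies more elaborate prompt templates; the point is that the prompt, not the model, is the tunable artifact.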

Marktine
1 recruiter
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
Data Science
R Programming
Python
SQL
Machine Learning (ML)
+1 more

Responsibilities:

  • Design and develop robust analytics systems and predictive models
  • Manage a team of data scientists, machine learning engineers, and big data specialists
  • Identify valuable data sources and automate data collection processes
  • Undertake pre-processing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams

Requirements:

  • Proven experience as a seasoned Data Scientist
  • Good experience in data mining processes
  • Understanding of machine learning; knowledge of operations research is a value addition
  • Strong understanding of and experience in R, SQL, and Python; knowledge of Scala, Java, or C++ is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Strong math skills (e.g. statistics, algebra)
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • Experience in Natural Language Processing (NLP)
  • Strong competitive coding skills
  • BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
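The "combine models through ensemble modeling" responsibility above can be sketched in a few lines. In this toy illustration the "models" are hand-written rules standing in for trained classifiers (e.g. from scikit-learn), and the combination rule is simple majority voting; both are assumptions for the sketch.

```python
# Toy ensemble: combine per-model predictions by majority vote.
from collections import Counter

def majority_vote(predictions):
    """Most common label across models; ties resolved by first seen."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(models, sample):
    return majority_vote([m(sample) for m in models])

# Three stand-in "models" classifying a numeric feature as high/low.
models = [
    lambda x: "high" if x > 10 else "low",
    lambda x: "high" if x > 12 else "low",
    lambda x: "high" if x > 8 else "low",
]

print(ensemble_predict(models, 11))  # two of three vote "high"
```

Real ensembles (bagging, boosting, stacking) use the same principle: aggregate several imperfect predictors so their errors partially cancel.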
E-Commerce Aggregator Platform
Agency job
via Qrata by Blessy Fernandes
Remote only
4 - 8 yrs
₹14L - ₹22L / yr
Machine Learning (ML)
Natural Language Processing (NLP)
Python
Algorithms
Deep Learning
• 6+ years of applied machine learning experience with a focus on natural language processing; some of our current projects require knowledge of natural language generation.
• 3+ years of software engineering experience.
• Advanced knowledge of Python, with 2+ years in a production environment.
• Experience with practical applications of deep learning.
• Experience with agile, test-driven development, continuous integration, and automated testing.
• Experience with productionizing machine learning models and integrating them into web services.
• Experience with the full software development life cycle, including requirements collection, design, implementation, testing, and operational support.
• Excellent verbal and written communication, teamwork, decision-making, and influencing skills.
• Hustle: thrives in an evolving, fast-paced, ambiguous work environment.
ketteq
1 recruiter
Posted by Nikhil Jain
Remote only
5 - 15 yrs
₹20L - ₹35L / yr
ETL
SQL
PostgreSQL

ketteQ is a supply chain planning and automation platform. We are looking for an extremely strong and experienced Technical Consultant to help with system design, data engineering, and software configuration and testing during the implementation of supply chain planning solutions. This job comes with a very attractive compensation package and a work-from-home benefit. If you are a high-energy, motivated, and initiative-taking individual, then this could be a fantastic opportunity for you.

 

Responsible for technical design and implementation of supply chain planning solutions.   

 

 

Responsibilities

  • Design and document system architecture
  • Design data mappings
  • Develop integrations
  • Test and validate data
  • Develop customizations
  • Deploy solution
  • Support demo development activities

Requirements

  • Minimum 5 years' experience in technical implementation of enterprise software, preferably supply chain planning software
  • Proficiency in ANSI SQL and PostgreSQL
  • Proficiency in ETL tools such as Pentaho, Talend, Informatica, and MuleSoft
  • Experience with Webservices and REST APIs
  • Knowledge of AWS
  • Salesforce and Tableau experience a plus
  • Excellent analytical skills
  • Must possess excellent verbal and written communication skills and be able to communicate effectively with international clients
  • Must be a self-starter and highly motivated individual who is looking to make a career in supply chain management
  • Quick thinker with proven decision-making and organizational skills
  • Must be flexible to work non-standard hours to accommodate globally dispersed teams and clients

Education

  • Bachelor's in Engineering from a top-ranked university with above-average grades
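The data-mapping and ETL work this role describes can be sketched end to end in miniature: extract rows from a delimited source, transform them (here, normalizing order quantities to a common unit), and load them into a relational table. SQLite stands in for PostgreSQL, and the table, columns, and conversion factor are invented for illustration.

```python
# Tiny ETL sketch: CSV -> transform -> relational table.
import csv, io, sqlite3

raw = "sku,qty,unit\nA1,5,box\nB2,12,each\n"

# Extract: parse the delimited source into dict rows.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: assume a box holds 10 "each" units (illustrative factor).
for r in rows:
    r["qty_each"] = int(r["qty"]) * (10 if r["unit"] == "box" else 1)

# Load: insert the normalized rows into a table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE demand (sku TEXT, qty_each INTEGER)")
con.executemany("INSERT INTO demand VALUES (?, ?)",
                [(r["sku"], r["qty_each"]) for r in rows])

total = con.execute("SELECT SUM(qty_each) FROM demand").fetchone()[0]
print(total)  # 5*10 + 12 = 62
```

Production ETL tools like Pentaho or Talend wrap exactly these three stages in visual pipelines, plus scheduling and error handling.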
Ganit Business Solutions
3 recruiters
Posted by Vijitha VS
Remote only
4 - 7 yrs
₹10L - ₹30L / yr
Scala
ETL
Informatica
Data Warehouse (DWH)
Big Data
+4 more

Job Description:

We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in batch and live-stream formats, transformed large volumes of data daily, built data warehouses to store the transformed data, and integrated different visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.

Responsibilities:

  • Develop, test, and implement data solutions based on functional / non-functional business requirements.
  • You would be required to code in Scala and PySpark daily, on cloud as well as on-prem infrastructure
  • Build data models to store the data in the most optimized manner
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Implement the ETL process and optimal data pipeline architecture
  • Monitor performance and advise on any necessary infrastructure changes.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Proactively identify potential production issues and recommend and implement solutions
  • Must be able to write quality code and build secure, highly available systems.
  • Create design documents that describe the functionality, capacity, architecture, and process.
  • Review peers' code and pipelines before deploying to production, for optimization issues and code standards

Skill Sets:

  • Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and 'big data' technologies.
  • Proficient understanding of distributed computing principles
  • Experience working with batch-processing/real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, and Apache Airflow.
  • Implemented complex projects dealing with considerable data sizes (PB scale).
  • Optimization techniques (performance, scalability, monitoring, etc.)
  • Experience with integration of data from multiple data sources
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
  • Knowledge of various ETL techniques and frameworks, such as Flume
  • Experience with various messaging systems, such as Kafka or RabbitMQ
  • Creation of DAGs for data engineering
  • Expert at Python/Scala programming, especially for data engineering / ETL purposes
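The "creation of DAGs" bullet refers to modeling a pipeline as a directed acyclic graph of tasks, as orchestration tools like Apache Airflow do. The underlying idea can be shown with the standard library alone (Python 3.9+): declare each task's dependencies, then resolve a valid execution order. The task names here are invented for illustration.

```python
# A pipeline as a DAG: each task maps to the set of tasks it depends on.
# graphlib (stdlib, Python 3.9+) resolves a valid execution order.
from graphlib import TopologicalSorter

pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Airflow adds scheduling, retries, and operators on top, but the dependency-resolution core is exactly this topological ordering; a cycle in the graph raises an error, which is why pipelines must be acyclic.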

 

 

 

Srijan Technologies
6 recruiters
Posted by PriyaSaini
Remote only
3 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data modeling
Python
PySpark
ETL
+3 more

Role Description:

  • You will be part of the data delivery team and will have the opportunity to develop a deep understanding of the domain/function.
  • You will design and drive the work plan for the optimization/automation and standardization of the processes incorporating best practices to achieve efficiency gains.
  • You will run data engineering pipelines, link raw client data with data model, conduct data assessment, perform data quality checks, and transform data using ETL tools.
  • You will perform data transformations, modeling, and validation activities, as well as configure applications to the client context. You will also develop scripts to validate, transform, and load raw data using programming languages such as Python and / or PySpark.
  • In this role, you will determine database structural requirements by analyzing client operations, applications, and programming.
  • You will develop cross-site relationships to enhance idea generation, and manage stakeholders.
  • Lastly, you will collaborate with the team to support ongoing business processes by delivering high-quality end products on time, and perform quality checks wherever required.

Job Requirement:

  • Bachelor’s degree in Engineering or Computer Science; Master’s degree is a plus
  • 3+ years of professional work experience with a reputed analytics firm
  • Expertise in handling large amounts of data through Python or PySpark
  • Ability to conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools
  • Experience deploying ETL / data pipelines and workflows in cloud technologies and architectures such as Azure and Amazon Web Services will be valued
  • Comfort with data modelling principles (e.g. database structure, entity relationships, UID, etc.) and software development principles (e.g. modularization, testing, refactoring, etc.)
  • A thoughtful and comfortable communicator (verbal and written) with the ability to facilitate discussions and conduct training
  • Strong problem-solving, requirement-gathering, and leadership skills.
  • Track record of completing projects successfully on time, within budget, and as per scope
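The data-quality checks this role performs before loading client data typically cover nulls, types, and value ranges. A minimal sketch of that pattern follows; the field names and validation rules are assumptions for illustration.

```python
# Basic row-level data-quality check: returns clean rows plus errors.
def quality_check(rows, required=("id", "amount")):
    """Validate required fields, numeric parsing, and value range."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            errors.append((i, f"missing {missing}"))
            continue
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            errors.append((i, "amount not numeric"))
            continue
        if amount < 0:
            errors.append((i, "negative amount"))
            continue
        clean.append({**row, "amount": amount})
    return clean, errors

clean, errors = quality_check([
    {"id": "a", "amount": "10.5"},
    {"id": "", "amount": "3"},
    {"id": "c", "amount": "-1"},
])
print(len(clean), len(errors))  # 1 2
```

In an ETL tool or PySpark job, each rule would become a filter or constraint, and the rejected rows would be routed to a quarantine table rather than a Python list.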

Credit Saison Finance Pvt Ltd
Posted by Najma Khanum
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹12L - ₹30L / yr
Data Science
R Programming
Python
Role & Responsibilities:
1) Understand the business objectives, formulate hypotheses, and collect the relevant data using SQL/R/Python. Analyse bureau, customer, and lending performance data on a periodic basis to generate insights. Present complex information and data in an uncomplicated, easy-to-understand way to drive action.
2) Independently build and refit robust models for achieving game-changing growth while managing risk.
3) Identify and implement new analytical/modelling techniques to improve model performance across the customer lifecycle (acquisitions, management, fraud, collections, etc.).
4) Help define the data infrastructure strategy for the Indian subsidiary.
a. Monitor data quality and quantity.
b. Define a strategy for acquisition, storage, retention, and retrieval of data elements, e.g. identify new data types and collaborate with technology teams to capture them.
c. Build a culture of strong automation and monitoring.
d. Stay connected to Analytics industry trends (data, techniques, technology, etc.) and leverage them to continuously evolve data science standards at Credit Saison.

Required Skills & Qualifications:
1) 3+ years working in data science domains, with experience in building risk models. Fintech/financial analysis experience is required.
2) Expert-level proficiency in analytical tools and languages such as SQL, Python, R/SAS, VBA, etc.
3) Experience building models using common modelling techniques (logistic and linear regressions, decision trees, etc.)
4) Strong familiarity with Tableau/Power BI/Qlik Sense or other data visualization tools
5) Tier-1 college graduate (IIT/IIM/NIT/BITS preferred).
6) Demonstrated autonomy, thought leadership, and learning agility.
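The risk models named above (e.g. logistic regression) ultimately produce a probability of default from a linear combination of features passed through a sigmoid. A hand-rolled scoring sketch follows; the feature names and coefficients are invented for illustration, not taken from any real model.

```python
# Logistic-regression-style risk score: sigmoid of a linear combination.
import math

def default_probability(features, coefs, intercept):
    """Probability in (0, 1) from invented coefficients."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1 / (1 + math.exp(-z))

# features: [utilization_ratio, recent_delinquencies] (assumed)
p = default_probability([0.9, 2], coefs=[2.0, 0.8], intercept=-3.0)
print(round(p, 3))
```

Refitting a model, as in responsibility 2, means re-estimating those coefficients on fresh performance data while keeping the scoring function itself unchanged.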
Service company, helps businesses harness the power of data
Agency job
via Jobdost by Ankitha Vyas
Remote only
4 - 8 yrs
₹10L - ₹20L / yr
Python
Ruby
Ruby on Rails (ROR)
Data Structures
Algorithms
+4 more

About the Company:

It is a Data-as-a-Service company that helps businesses harness the power of data. Our technology fuels some of the most interesting big data projects of the world. We are a small bunch of people working towards shaping the imminent data-driven future by solving some of its fundamental and toughest challenges.

 

 

Role: We are looking for an experienced team lead to drive data acquisition projects end to end. In this role, you will be working in the web scraping team with data engineers, helping them solve complex web problems and mentor them along the way. You’ll be adept at delivering large-scale web crawling projects, breaking down barriers for your team and planning at a higher level, and getting into the detail to make things happen when needed.  

 

Responsibilities  

  •  Interface with clients and sales team to translate functional requirements into technical requirements 
  •  Plan and estimate tasks with your team, in collaboration with the delivery managers 
  •  Engineer complex data acquisition projects 
  •  Guide and mentor your team of engineers 
  •  Anticipate issues that might arise and proactively factor them into the design
  •  Perform code reviews and suggest design changes 

 

 

Prerequisites 

  • Between 5-8 years of relevant experience
  • Fluent programming skills; well-versed in scripting languages like Python or Ruby
  • Solid foundation in data structures and algorithms
  • Excellent tech troubleshooting skills
  • Good understanding of the web data landscape
  • Prior exposure to the DOM and XPath, and hands-on experience with Selenium/automated testing, is a plus
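The XPath exposure asked for above is about addressing nodes in a parsed document tree. The standard library's ElementTree supports a limited XPath subset, enough to show the idea; a real crawler would use lxml or Selenium against live HTML. The markup below is an invented sample.

```python
# XPath-style extraction with the stdlib's limited XPath support.
import xml.etree.ElementTree as ET

page = """
<html><body>
  <div class="product"><span class="name">Widget</span>
    <span class="price">9.99</span></div>
  <div class="product"><span class="name">Gadget</span>
    <span class="price">24.50</span></div>
</body></html>
"""

root = ET.fromstring(page)
# Attribute-predicate XPath: all <span> elements with class="name".
names = [s.text for s in root.findall(".//span[@class='name']")]
# Equivalent manual walk over the tree for the prices.
prices = [float(s.text) for s in root.iter("span")
          if s.get("class") == "price"]
print(names, prices)
```

The same selectors scale to large-scale crawls; what changes at scale is fetching, politeness, and resilience to markup drift, not the extraction idea.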

 

Skills and competencies 

  • Prior experience with team handling and people management is mandatory 
  • Work independently with little to no supervision 
  • Extremely high attention to detail  
  •  Ability to juggle between multiple projects  
Catalyst IQ
6 recruiters
Posted by Sidharth Maholia
Mumbai, Bengaluru (Bangalore)
1 - 5 yrs
₹15L - ₹25L / yr
Tableau
SQL
MS-Excel
Python
Data Analytics
+2 more
Responsibilities:
• Ability to do exploratory analysis: fetch data from systems and analyze trends.
• Develop customer segmentation models to improve the efficiency of marketing and product campaigns.
• Establish mechanisms for cross-functional teams to consume customer insights to improve engagement along the customer lifecycle.
• Gather requirements for dashboards from business, marketing, and operations stakeholders.
• Prepare internal reports for executive leadership and support their decision-making.
• Analyse data, derive insights, and embed them into business actions.
• Work with cross-functional teams.
Skills Required
• Data analytics visionary.
• Strong in SQL & Excel; good to have experience in Tableau.
• Experience in the field of data analysis and data visualization.
• Strong in analysing data and creating dashboards.
• Strong in communication, presentation, and business intelligence.
• Multi-dimensional "growth hacker" skill set with a strong sense of ownership for work.
• Aggressive "take no prisoners" approach.
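The customer segmentation mentioned in the responsibilities can start as simple rules over recency and spend (a simplified RFM-style scheme). The thresholds, field names, and segment labels below are invented for illustration; real segmentation would typically be data-driven (e.g. clustering).

```python
# Rule-based customer segmentation sketch (simplified RFM-style).
def segment(customer):
    """Assign a segment from recency (days) and monetary value."""
    if customer["days_since_order"] <= 30 and customer["total_spend"] >= 500:
        return "champion"
    if customer["days_since_order"] <= 30:
        return "recent"
    if customer["total_spend"] >= 500:
        return "lapsing high-value"
    return "at risk"

customers = [
    {"id": 1, "days_since_order": 10, "total_spend": 800},
    {"id": 2, "days_since_order": 90, "total_spend": 100},
]
segments = {c["id"]: segment(c) for c in customers}
print(segments)
```

Marketing campaigns then target each segment differently, which is how segmentation improves campaign efficiency.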
YourHRfolks
6 recruiters
Posted by Pranit Visiyait
Jaipur
4 - 6 yrs
₹13L - ₹16L / yr
Python
SQL
MySQL
Data Visualization
R
+3 more

About Us
Punchh is the leader in customer loyalty, offer management, and AI solutions for offline and omni-channel merchants including restaurants, convenience stores, and retailers. Punchh brings the power of online to physical brands by delivering omni-channel experiences and personalization across the entire customer journey, from acquisition through loyalty and growth, to drive same-store sales and customer lifetime value. Punchh uses best-in-class integrations to POS and other in-store systems, such as WiFi, to deliver real-time SKU-level transaction visibility and offer provisioning for physical stores.


Punchh is growing exponentially and serves 200+ brands that encompass 91K+ stores globally. Punchh's customers include top convenience stores such as Casey's General Stores; 25+ of the top 100 restaurant brands, such as Papa John's, Little Caesars, Denny's, Focus Brands (5 of 7 brands), and Yum! Brands (KFC, Pizza Hut, and Taco Bell); and retailers. For a multi-billion-dollar brand with 6K+ stores, Punchh drove a 3% lift in same-store sales within the first year. Punchh is powering loyalty programs for 135+ million consumers.

Punchh has raised $70 million from premier Silicon Valley investors including Sapphire Ventures and Adam Street Partners, has a seasoned leadership team with extensive experience in digital, marketing, CRM, and AI technologies as well as deep restaurant and retail industry expertise.


About the Role: 

Punchh Tech India Pvt. is looking for a Senior Data Analyst – Business Insights to join our team. If you're excited to be part of a winning team, Punchh is a great place to grow your career.

This position is responsible for discovering the important trends in the complex data generated on the Punchh platform that have high business impact (influencing product features and roadmap), creating hypotheses around these trends, validating them with statistical significance, and making recommendations.


Reporting to: Director, Analytics

Job Location: Jaipur

Experience Required: 4-6 years


What You’ll Do

  • Take ownership of custom data analysis projects/requests and work closely with end users (both internal and external clients) to deliver the results
  • Identify successful implementation/utilization of product features and contribute to the best-practices playbook for client facing teams (Customer Success)
  • Strive towards building mini business intelligence products that add value to the client base
  • Represent the company’s expertise in advanced analytics in a variety of media outlets such as client interactions, conferences, blogs, and interviews.

What You’ll Need

  • Master's in business/behavioral economics/statistics with a strong interest in marketing technology
  • Proven track record of at least 5 years uncovering business insights, especially related to behavioral economics, and adding value to businesses
  • Proficient in using the proper statistical and econometric approaches to establish the presence and strength of trends in data; strong statistical knowledge is mandatory.
  • Extensive prior exposure to causal inference studies, based on both longitudinal and cross-sectional data.
  • Excellent experience using Python (or R) to analyze data from extremely large or complex data sets
  • Exceptional data querying skills (Snowflake/Redshift, Spark, Presto/Athena, to name a few)
  • Ability to effectively articulate complex ideas in simple and effective presentations to diverse groups of stakeholders.
  • Experience working with a visualization tool (preferably, but not restricted to, Tableau)
  • Domain expertise: extensive exposure to the retail or restaurant business, or work on loyalty programs and promotion/campaign effectiveness
  • Should be self-organized and able to proactively identify problems and propose solutions
  • Gels well within and across teams; works with stakeholders from various functions such as Product, Customer Success, and Implementations, among others
  • As the stakeholders on the business side are based in the US, should be flexible to schedule meetings convenient to West Coast timings
  • Effective in working autonomously to get things done and taking the initiative to anticipate the needs of executive leadership
  • Able and willing to relocate to Jaipur post pandemic.
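"Validating hypotheses with statistical significance", as this role describes, often reduces to a standard test such as a two-proportion z-test, e.g. comparing conversion rates between two campaign groups. The counts below are invented sample data.

```python
# Two-proportion z-test: is the difference in conversion rates
# between groups A and B statistically significant?
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    p = (success_a + success_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # std. error
    return (p_a - p_b) / se

# 12% vs 9% conversion on 1000 users each (invented numbers).
z = two_proportion_z(120, 1000, 90, 1000)
print(round(z, 2))
```

A |z| above roughly 1.96 corresponds to significance at the 5% level for a two-sided test, so here the lift would be judged significant; in practice the analyst would also report the confidence interval and effect size.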

Benefits:

  • Medical Coverage, to keep you and your family healthy.
  • Compensation that stacks up with other tech companies in your area.
  • Paid vacation days and holidays to rest and relax.
  • Healthy lunch provided daily to fuel you through your work.
  • Opportunities for career growth and training support, including fun team building events.
  • Flexibility and a comfortable work environment for you to feel your best.
 