Sr. Cloud IoT Engineer
Posted by Shalu Jain
10 - 20 yrs
₹15L - ₹35L / yr
Remote only
Skills
Internet of Things (IoT)
Amazon Web Services (AWS)
Cloud native
SiteWise
AWS Lambda
Performance tuning
CI/CD
RESTful APIs
Microsoft Azure
Google Cloud Platform (GCP)

As a Sr. Cloud IoT Engineer with Intuitive, you will be responsible for data acquisition from devices and sensors, allowing those devices to connect seamlessly with other systems. You will research, create, test, and document IoT solutions with integrated systems and devices to support the analytics and data science initiatives across our enterprise customers.

You will work closely with SMEs in Data Engineering and Cloud Engineering to create solutions and extend Intuitive's DataOps Engineering projects and initiatives. The Sr. Cloud IoT Engineer plays a central, critical role in establishing DataOps/DataX data logistics and management: building data pipelines, enforcing best practices, owning the construction of complex and performant Data Lake environments, and working closely with Cloud Infrastructure Architects and DevSecOps automation teams. The Sr. Cloud IoT Engineer is the main point of contact for everything related to ingesting telemetry data and loading it into time-series or other databases. In this role, we expect our DataOps leaders to be obsessed with telemetry data and with providing insights that help our end customers.

  KEY REQUIREMENTS:

  • 10+ years of experience as a data engineer.
  • 5+ years implementing IoT engineering solutions with multiple cloud providers and toolsets.
  • This is a hands-on role building data pipelines using cloud-native and partner solutions, with hands-on technical experience handling data at scale.
  • Deep understanding of solutions like the AWS IoT Greengrass edge runtime and its companion cloud services to build, deploy, and manage device software across a fleet of devices.
  • Hands-on experience with the AWS IoT stack: IoT Core, SiteWise, Kinesis, Lambda, and Timestream (a minimal telemetry sketch follows this list).
  • Performance tuning of streaming telemetry data with tools like Grafana, Amazon CloudWatch, and QuickSight, informed by business requirements.
  • Good working experience with web service integration, RESTful APIs, WebSockets, and MQTT.
  • Experience with development tools for CI/CD, unit and integration testing, automation, and orchestration.
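To make the telemetry requirement concrete, here is a minimal, illustrative sketch (not part of the original posting) of publishing a device reading to AWS IoT Core with boto3. The region, topic hierarchy, and payload fields are assumptions; in production such a reading would typically be routed onward to Timestream or SiteWise by an IoT rule.

# Minimal sketch: publish one telemetry reading to AWS IoT Core.
# The region, topic hierarchy, and payload fields are hypothetical.
import json
import boto3

iot = boto3.client("iot-data", region_name="us-east-1")  # assumed region

def publish_telemetry(device_id: str, temperature_c: float) -> None:
    """Publish a single reading to a device-specific MQTT topic."""
    iot.publish(
        topic=f"telemetry/{device_id}",   # hypothetical topic hierarchy
        qos=1,                            # at-least-once delivery
        payload=json.dumps({"deviceId": device_id, "temperatureC": temperature_c}),
    )

publish_telemetry("sensor-042", 21.7)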


Similar jobs

SteelEye
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
1 - 8 yrs
₹10L - ₹40L / yr
Python
ETL
Jenkins
CI/CD
pandas
+6 more
Roles & Responsibilities
Expectations of the role
This role reports to the Technical Lead (Support). You will be expected to resolve bugs in the platform that are identified by customers and internal teams. This role will progress towards SDE-2 in 12-15 months, where the developer will work on solving complex problems around scale and building out new features.
 
What will you do?
  • Fix issues with plugins for our Python-based ETL pipelines
  • Help with automation of standard workflows
  • Deliver Python microservices for provisioning and managing cloud infrastructure (a small sketch follows this list)
  • Take responsibility for any refactoring of code
  • Effectively manage the challenges of handling large volumes of data while working to tight deadlines
  • Manage expectations with internal stakeholders and context-switch in a fast-paced environment
  • Thrive in an environment that uses AWS and Elasticsearch extensively
  • Keep abreast of technology and contribute to the engineering strategy
  • Champion best development practices and provide mentorship to others
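As a small, hypothetical illustration of the "Python microservices for provisioning cloud infrastructure" item above (the helper, bucket name, and region are our own assumptions, not SteelEye's code):

# Minimal sketch: an idempotent S3 provisioning helper of the kind such a
# microservice might expose. Bucket name and region are placeholders.
import boto3
from botocore.exceptions import ClientError

def ensure_bucket(name: str, region: str = "eu-west-1") -> None:
    """Create an S3 bucket if it does not already exist."""
    s3 = boto3.client("s3", region_name=region)
    try:
        s3.head_bucket(Bucket=name)  # raises ClientError if missing or inaccessible
    except ClientError:
        s3.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )

ensure_bucket("example-etl-staging")  # hypothetical bucket name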
What are we looking for?
  • First and foremost, you are a Python developer experienced with the Python data stack
  • You love and care about data
  • Your code is an artistic manifesto reflecting how elegant you are in what you do
  • You feel sparks of joy when a new abstraction or pattern arises from your code
  • You follow the DRY (Don't Repeat Yourself) and KISS (Keep It Short and Simple) principles
  • You are a continuous learner
  • You have a natural willingness to automate tasks
  • You have critical thinking and an eye for detail
  • Excellent ability and experience working to tight deadlines
  • Sharp analytical and problem-solving skills
  • Strong sense of ownership and accountability for your work and delivery
  • Excellent written and oral communication skills
  • Mature collaboration and mentoring abilities
  • We are keen to know your digital footprint (community talks, blog posts, certifications, courses you have participated in or you are keen to, your personal projects as well as any kind of contributions to the open-source communities if any)
Nice to have:
  • Delivering complex software, ideally in a FinTech setting
  • Experience with CI/CD tools such as Jenkins, CircleCI
  • Experience with code versioning (git / mercurial / subversion)
Fintech Company
Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
1 - 3 yrs
₹9L - ₹12L / yr
Python
SQL
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
+3 more
Job Responsibilities:

1+ years of proven experience in ML/AI with Python

Work with the manager through the entire analytical and machine learning model life cycle:

⮚ Define the problem statement

⮚ Build and clean datasets

⮚ Exploratory data analysis

⮚ Feature engineering

⮚ Apply ML algorithms and assess their performance (a compressed sketch follows this list)

⮚ Codify for deployment

⮚ Test and troubleshoot the code

⮚ Communicate analysis to stakeholders
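A compressed, illustrative sketch of the life cycle above using scikit-learn; the toy fraud dataset, engineered feature, and model choice are assumptions for illustration only:

# Compressed sketch of the life cycle: build a dataset, engineer a feature,
# fit a model, assess performance. All data and choices are illustrative.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.DataFrame({                              # build and clean the dataset
    "amount": [120, 3500, 40, 990, 15, 7800, 230, 4100],
    "hour":   [9, 2, 14, 11, 16, 3, 10, 1],
    "fraud":  [0, 1, 0, 0, 0, 1, 0, 1],
})
df["is_night"] = (df["hour"] < 6).astype(int)    # feature engineering

X_train, X_test, y_train, y_test = train_test_split(
    df[["amount", "is_night"]], df["fraud"], test_size=0.25, random_state=0
)
model = RandomForestClassifier(random_state=0)   # apply an ML algorithm
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))  # assess performance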

Technical Skills

⮚ Proven experience using Python and SQL

⮚ Excellent programming and statistics skills

⮚ Working knowledge of tools and utilities: AWS, DevOps with Git, Selenium, Postman, Airflow, PySpark

Monarch Tractors India
Hyderabad
5 - 8 yrs
Best in industry
Python
Amazon Web Services (AWS)
PostgreSQL
Ubuntu
Web Services Description Language (WSDL)

Designation: Principal Data Engineer

Experience: Experienced

Position Type: Full Time Position

Location: Hyderabad

Office Timings: 9 AM to 6 PM

Compensation: As per industry standards

 

About Monarch:

 

At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies. With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability and cost issues, and provide an avenue for competitive organic and beyond farming by replacing harmful chemical solutions with mechanical ones. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world.

 

Description:

 

Monarch Tractor would like to invite an experienced Python data engineer to lead our internal data engineering team in India. This is a unique opportunity to work on computer vision AI data pipelines for electric tractors. You will be dealing with data from a farm environment, such as videos, images, tractor logs, GPS coordinates and map polygons, and you will be responsible for collecting data for research and development. For example, this includes setting up ETL data pipelines to extract data from tractors, loading this data into the cloud, and recording AI training results.
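As a minimal sketch of that extract-and-load step (the bucket, key layout, and log path below are hypothetical, not Monarch's actual pipeline):

# Minimal sketch: gzip a local tractor log and upload it to S3 under a
# per-tractor prefix. Bucket, key layout, and path are placeholders.
import gzip
import shutil
from pathlib import Path

import boto3

def upload_tractor_log(log_path: Path, tractor_id: str, bucket: str) -> str:
    """Compress a log file and upload it; returns the S3 key."""
    gz_path = log_path.with_suffix(log_path.suffix + ".gz")
    with open(log_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    key = f"raw/{tractor_id}/{gz_path.name}"
    boto3.client("s3").upload_file(str(gz_path), bucket, key)
    return key

upload_tractor_log(Path("/var/log/tractor/drive.log"), "tractor-0042", "example-farm-logs")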

 

This role includes, but is not limited to, the following tasks:

 

● Lead data engineering team

● Own and contribute to more than 50% of the data engineering code base

● Scope out new project requirements

● Estimate costs for data pipeline solutions

● Create data engineering tooling

● Design custom data structures for efficient processing of data

 

Data engineering skills we are looking for:

 

● Able to work with large amounts of text log data, image data, and video data

● Fluently use AWS cloud solutions like S3, Lambda, and EC2

● Able to work with data from Robot Operating System

 

Required Experience:

 

● 3 to 5 years of experience using Python

● 3 to 5 years of experience using PostgreSQL

● 3 to 5 years of experience using AWS EC2, S3, Lambda

● 3 to 5 years of experience using Ubuntu OS or WSL

 

Good to have experience:

 

● Ray

● Robot Operating System

 

What you will get:

 

At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary, excellent health, dental and vision benefits, and company equity commensurate with the role you’ll play in our success. 

Mactores Cognition Private Limited
Remote only
5 - 15 yrs
₹5L - ₹21L / yr
ETL
Informatica
Data Warehouse (DWH)
Amazon Web Services (AWS)
Amazon S3
+3 more

Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward with a digital transformation via assessments, migration, or modernization.


We are looking for a DataOps Engineer with expertise in operating a data lake. The data lake is built with Amazon S3, Amazon EMR, and Apache Airflow for workflow management.


You should have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services like Amazon CloudWatch.
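For context, a minimal sketch of the kind of Airflow DAG such a role operates (Airflow 2.x syntax assumed; the DAG id, schedule, and task body are illustrative, not Mactores' actual pipeline):

# Minimal sketch: a daily DAG with one Python task. DAG id, schedule,
# and the health-check body are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def check_data_lake_partitions() -> None:
    """Placeholder check, e.g. verify yesterday's S3 partitions landed."""
    print("verifying partitions...")

with DAG(
    dag_id="data_lake_health_check",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="check_partitions",
        python_callable=check_data_lake_partitions,
    )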


If you love solving problems using your technical skills and have a professional services background, you will enjoy our casual and fun office environment, which actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.


What will you do?


  • Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow
  • Debug and fix production issues in PySpark
  • Determine the root cause (RCA) of production issues
  • Collaborate with product teams on L3/L4 production issues in PySpark
  • Contribute to enhancing ETL efficiency
  • Build CloudWatch dashboards to improve operational efficiency
  • Handle escalation tickets from L1 monitoring engineers
  • Assign tickets to L1 engineers based on their expertise


What are we looking for?


  • AWS DataOps engineer.
  • 5+ years of overall experience in the software industry, with experience and expertise in developing and architecting data applications using Python or Scala, Airflow, and Kafka on an AWS data platform.
  • Must have set up, or led the project to enable, DataOps on AWS or another cloud data platform.
  • Strong data engineering experience on a cloud platform, preferably AWS.
  • Experience with data pipelines designed for reuse and parameterization.
  • Experience with pipelines designed to solve common ETL problems.
  • Understanding of, or experience with, codifying AWS services such as Amazon EMR and Apache Airflow to enable DataOps.
  • Experience building data pipelines using CI/CD infrastructure.
  • Understanding of Infrastructure as Code for DataOps enablement.
  • Ability to work with ambiguity and create quick PoCs.


You will be preferred if


  • Expertise in Amazon EMR, Apache Airflow, Terraform, CloudWatch
  • Exposure to MLOps using Amazon SageMaker is a plus.
  • AWS Solutions Architect Professional or Associate Level Certificate
  • AWS DevOps Professional Certificate


Life at Mactores


We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.


1. Be one step ahead

2. Deliver the best

3. Be bold

4. Pay attention to the detail

5. Enjoy the challenge

6. Be curious and take action

7. Take leadership

8. Own it

9. Deliver value

10. Be collaborative


We would like you to read more details about the work culture on https://mactores.com/careers 


The Path to Joining the Mactores Team

At Mactores, our recruitment process is structured around three distinct stages:


Pre-Employment Assessment: 

You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.


Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.


HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.


At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.


Latent Bridge Pvt Ltd
Mansoor Khan
Posted by Mansoor Khan
Remote only
3 - 7 yrs
₹5L - ₹20L / yr
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

· Familiar with the MicroStrategy architecture; Admin Certification preferred

· Familiar with administrative functions, using Object Manager, Command Manager, installation/configuration of MSTR in clustered architecture, applying patches, hot-fixes

· Monitor and manage existing Business Intelligence development/production systems

· MicroStrategy installation, upgrade and administration on Windows and Linux platform

· Ability to support and administer multi-tenant MicroStrategy infrastructure including server security troubleshooting and general system maintenance.

· Analyze application and system logs while troubleshooting and root cause analysis

· Work on operations like deploying and managing packages, user management, schedule management, governing settings best practices, and database instance and security configuration.

· Monitor, report and investigate solutions to improve report performance.

· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.

· Provide support for the platform, report execution and implementation, user community and data investigations.

· Identify improvement areas in Environment hosting and upgrade processes.

· Identify automation opportunities and participate in automation implementations

· Provide on-call support for Business Intelligence issues

· Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features like Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.

· Familiar with AWS, Linux Scripting

· Knowledge of MSTR Mobile

· Knowledge of capacity planning and system’s scaling needs

xpressbees
Alfiya Khan
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and our (RT, NRT, batch) reporting & dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines will productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness and freshness requirements.
• Evaluate and identify data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/PIG) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance optimization techniques required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching, to make technology and design choices.
• Strong experience in System Integration, Application Development, ETL and Data-Platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
  • Expertise in relational and dimensional modelling
  • Exposure across all of the SDLC process
  • Experience in cloud architecture (AWS)
• Proven track record of keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Chennai, Hyderabad
5 - 10 yrs
₹10L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more

Big Data with cloud:

 

Experience : 5-10 years

 

Location : Hyderabad/Chennai

 

Notice period : 15-20 days Max

 

1.  Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight

2.  Experience in developing Lambda functions with AWS Lambda

3.  Expertise with Spark/PySpark – candidates should be hands-on with PySpark code and able to write transformations in Spark (a minimal sketch follows this list)

4.  Should be able to code in Python and Scala.

5.  Snowflake experience will be a plus
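As referenced in point 3, a minimal PySpark sketch of such a transformation (input/output paths and column names are hypothetical):

# Minimal sketch: read raw JSON events, aggregate per day and device,
# and write Parquet. Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_rollup").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")   # assumed input path
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "device_id")
       .agg(
           F.avg("reading").alias("avg_reading"),
           F.count("*").alias("event_count"),
       )
)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_rollup/"            # assumed output path
)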

Nascentvision
Shanu Mohan
Posted by Shanu Mohan
Gurugram, Mumbai, Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹17L / yr
Python
PySpark
Amazon Web Services (AWS)
Spark
Scala
+2 more
  • Hands-on experience with any cloud platform
  • Versed in Spark, Scala/Python, and SQL
  • Microsoft Azure experience
  • Experience working on real-time data processing pipelines
Freelancer
Nirmala Hk
Posted by Nirmala Hk
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹35L / yr
Python
Shell Scripting
MySQL
SQL
Amazon Web Services (AWS)
+3 more

  • 3+ years of experience in deployment, monitoring, tuning, and administration of high-concurrency MySQL production databases
  • Solid understanding of writing optimized SQL queries on MySQL databases (a small sketch follows this list)
  • Understanding of AWS, VPC, networking, security groups, IAM, and roles
  • Expertise in scripting in Python or Shell/PowerShell
  • Must have experience with large-scale data migrations
  • Excellent communication skills
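As referenced above, a small sketch of reviewing a query plan with EXPLAIN via mysql-connector-python (connection details, table, and index are hypothetical placeholders):

# Small sketch: confirm a query uses an index rather than a full scan.
# Connection details and the orders table/index are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="db.example.internal", user="app", password="***", database="shop"
)
cur = conn.cursor()
# Assumes a composite index on orders(customer_id, created_at).
cur.execute(
    "EXPLAIN SELECT id, total FROM orders "
    "WHERE customer_id = %s AND created_at >= %s",
    (42, "2024-01-01"),
)
for row in cur.fetchall():
    print(row)  # look for type=ref/range rather than type=ALL (full scan)
cur.close()
conn.close()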
MNC
Agency job
via I Squaresoft by Khadri SH
Remote only
5 - 8 yrs
₹10L - ₹20L / yr
ETL
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
SSIS
Cloud Data Warehouse
Job Description

Problem Formulation: Identifies possible options to address the business problem; must possess a good understanding of dimensional modelling.

Must have worked on at least one end-to-end project using any cloud data warehouse (Azure Synapse, AWS Redshift, Google BigQuery).

Good to have an understanding of Power BI and its integration with cloud services like Azure or GCP.

Experience working with SQL Server and SSIS (preferred).

Applied Business Acumen: Supports the development of business cases and recommendations. Owns delivery of project activity and tasks assigned by others. Supports process updates and changes. Solves business issues.

Data Transformation/Integration/Optimization:

The ETL developer is responsible for designing and creating the data warehouse and all related data extraction, transformation, and load functions in the company.

The developer should provide oversight and planning of data models, database structural design, and deployment, and work closely with the data architect and business analyst.

Duties include working in cross-functional software development teams (business analysts, testers, developers), following agile ceremonies and development practices.

The developer plays a key role in contributing to the design, evaluation, selection, implementation, and support of database solutions.

Development and Testing: Develops codes for the required solution by determining the appropriate approach and leveraging business, technical, and data requirements.

Creates test cases to review and validate the proposed solution design. Works on POCs and deploys the software to production servers.

Good to Have (Preferred Skills):

  • Minimum 4-8 years of experience in data warehouse design and development for large-scale applications
  • Minimum 3 years of experience with star schemas, dimensional modelling, and extract-transform-load (ETL) design and development
  • Expertise working with various databases (SQL Server, Oracle)
  • Experience developing packages, procedures, views and triggers
  • Nice to have: big data technologies
  • Must have good written and oral communication skills
  • Nice to have: SSIS

 

Education and Experience

  • Minimum 4-8 years of software development experience
  • Bachelor's and/or Master's degree in Computer Science

Please reply with the below details:

Total Experience:
Relevant Experience:

Current CTC:
Expected CTC:

Any offers: Y/N

Notice Period:

Qualification:

DOB:
Present Company Name:

Designation:

Domain

Reason for job change:

Current Location:
