Senior IoT Cloud Engineer
Intelekt AI (Previously Techweirdo)

Posted by Gaurab Patra
5 - 9 yrs
₹14L - ₹15L / yr
Mumbai
Skills
C#
.NET
Microsoft Windows Azure
Internet of Things (IOT)
Cloud Computing

Techweirdo delivers AI models and enterprise solutions globally for mid- to large-scale organizations.

We offer consultation, services, and products to holistically address the digital transformation goals of an enterprise.

We are currently hiring passionate, senior IoT cloud engineers on behalf of one of our large customers to help them find the best-fit talent and create technologically challenging, visually delightful, and easy-to-use digital products in a fast-paced environment.

 

Skills/ Role Requirements:

  • Good hands-on experience in Azure IoT Gateway / IoT Hub development
  • Good hands-on experience with Azure Functions, Azure Event Hub, Azure IoT Edge, and cloud platform security
  • Strong knowledge of C# .NET
  • Industrial IoT experience is a must
  • Device communication knowledge with exposure to different protocols is an advantage
  • Good communication skills, as this position requires consistent interaction with business stakeholders and other engineers
  • Hands-on experience in optimization, architecture, and building scalable real-time data pipelines is a plus
  • 5+ years of relevant experience

 

 

Perks:

  1. Surrounded by curious learners: With a growth mindset as our core strength, we have created a learning environment full of curious tech learners.
  2. New challenges every day: There is no ordinary day at TechWeirdo; if you like solving problems, this is the right place for you.
  3. Zero micro-management, limited supervision: We encourage our team to take on challenging tasks and solve complex problems by taking ownership of their work. We trust our team to take calculated risks.
  4. Great networking: You will be connected with C-suite executives of top organizations while working with our winning team.
  5. Building technology how you want, when you want: We welcome people who see things differently, as they are the ones who have the ability to change the world.

About Intelekt AI (Previously Techweirdo)

Founded: 2019
Size: 20-100
Stage: Bootstrapped
About

Unlocking the power of growing businesses with AI

Connect with the team
Gaurab Patra
Sudeshna Mukhopadhay
Parul Chouhan
Company social profiles: LinkedIn, Facebook

Similar jobs

UpSolve Solutions LLP
Shaurya Kuchhal
Posted by Shaurya Kuchhal
Mumbai
2 - 4 yrs
₹7L - ₹11L / yr
Machine Learning (ML)
Data Science
Microsoft Windows Azure
Google Cloud Platform (GCP)
Python
+3 more

About UpSolve

Work on a cutting-edge tech stack and build innovative solutions in Computer Vision, NLP, Video Analytics, and IoT.


Job Role

  • Ideate use cases that incorporate recent tech releases.
  • Discuss business plans and assist teams in aligning with dynamic KPIs.
  • Design the solution architecture end to end, from inputs through the infrastructure and services used to the data store.


Job Requirements

  • Working knowledge of Azure Cognitive Services.
  • Project experience building AI solutions such as chatbots, sentiment analysis, and image classification.
  • Quick learner and problem solver.


Job Qualifications

  • Work Experience: 2+ years
  • Education: Computer Science/IT Engineer
  • Location: Mumbai
Episource
Ahamed Riaz
Posted by Ahamed Riaz
Mumbai
5 - 12 yrs
₹18L - ₹30L / yr
Big Data
Python
Amazon Web Services (AWS)
Serverless
DevOps
+4 more

ABOUT EPISOURCE:


Episource has devoted more than a decade to building solutions for risk adjustment to measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data, and analytics to enable better documentation of care for patients with chronic diseases.


The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question: how can data be “deployed”? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to infuse added productivity and efficiency into our workflow. Combined, these build a foundation of tools and practices used by quantitative staff across the company.


What’s our poison you ask? We work with most of the popular frameworks and technologies like Spark, Airflow, Ansible, Terraform, Docker, ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.


ABOUT THE ROLE:


We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting and information extraction from clinical notes.


This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a large range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.


You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.


During the course of a typical day with our team, expect to work on one or more projects around the following:


1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines
3. Build CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform and Ansible
4. Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems
5. Apply software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
6. Deploy data pipelines in production using Infrastructure-as-Code platforms
7. Design scalable implementations of the models developed by our Data Science teams
8. Big data and distributed ML with PySpark on AWS EMR, and more!


BASIC REQUIREMENTS

  1. Bachelor’s degree or greater in Computer Science, IT or related fields
  2. Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects
  3. Strong experience with bash scripting, Unix environments and building scalable/distributed systems
  4. Experience with automation/configuration management using Ansible, Terraform, or equivalent
  5. Very strong experience with AWS and Python
  6. Experience building CI/CD systems
  7. Experience with containerization technologies like Docker, Kubernetes, ECS, EKS or equivalent
  8. Ability to build and manage application and performance monitoring processes

Product and Service based company
Hyderabad, Ahmedabad
8 - 12 yrs
₹15L - ₹30L / yr
SQL Server
Relational Database (RDBMS)
NOSQL Databases
Oracle
Database Design
+3 more

Job Description

Job Responsibilities

  • Design and implement robust database solutions, including:
    • Security, backup and recovery
    • Performance, scalability, monitoring and tuning
    • Data management and capacity planning
    • Planning and implementing failover between database instances
  • Create data architecture strategies for each subject area of the enterprise data model.
  • Communicate plans, status and issues to higher management levels.
  • Collaborate with the business, architects and other IT organizations to plan a data strategy, sharing important information related to database concerns and constraints.
  • Produce all project data architecture deliverables.
  • Create and maintain a corporate repository of all data architecture artifacts.

 

Skills Required:

  • Understanding of data analysis, business principles, and operations
  • Software architecture and design; network design and implementation
  • Data visualization, data migration and data modelling
  • Relational database management systems
  • DBMS software, including SQL Server
  • Database and cloud computing design, architectures and data lakes
  • Information management and data processing on multiple platforms
  • Agile methodologies and enterprise resource planning implementation
  • Demonstrated database technical expertise, such as performance tuning, backup and recovery, and monitoring
  • Excellent skills with advanced features such as database encryption, replication, partitioning, etc.
  • Strong problem solving, organizational and communication skills

Data Sutram
Ankit Das
Posted by Ankit Das
Mumbai, Gurugram
2 - 10 yrs
Best in industry
Data Science
Python
Data Analytics
Pipeline management
Cloud Computing
+7 more
Data Sutram, funded by India Infoline (IIFL), Indian Angel Network (IAN) and 100x.VC (led by Sanjay Mehta), is an alternate data company that uses external data feeds to create every location's DNA, which is utilized in use cases like credit underwriting, location profiling, site selection, etc. It is one of the fastest-growing companies in the space of Artificial Intelligence & Location Analytics in India. As a data scientist, you will work on our core product and on critical client use cases, and explore new solutions in collaboration with Business Analysts and fellow Data Scientists.

Roles and Responsibilities

  • Managing available resources such as hardware, data, and personnel so that deadlines are met
  • Analyzing the ML and Deep Learning algorithms that could be used to solve a given problem and ranking them by their success probabilities
  • Exploring data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world
  • Defining a validation framework and establishing a process to ensure acceptable data quality criteria are met
  • Supervising the data acquisition and partnership roadmaps to create a stronger product for our customers
  • Defining the feature engineering process to ensure usage of meaningful features given the business constraints, which may vary by market
  • Devising self-learning strategies through analysis of errors from the models
  • Understanding business issues and context, devising a framework for solving unstructured problems, and articulating clear and actionable solutions underpinned by analytics
  • Managing multiple projects simultaneously while demonstrating business leadership to collaborate and coordinate with different functions to deliver the solutions in a timely, efficient and effective manner
  • Managing project resources optimally to deliver projects on time; driving innovation using residual resources to create a strong solution pipeline; providing direction, coaching, training and feedback to project team members to enhance performance, support development and encourage value-aligned behaviour; providing inputs for periodic performance appraisal of project team members

 

Preferred Technical & Professional expertise

  • Undergraduate degree in Computer Science / Engineering / Mathematics / Statistics / Economics or other quantitative fields
  • At least 2 years of experience managing Data Science projects with specialization in Machine Learning
  • In-depth knowledge of cloud analytics tools
  • Able to drive Python code optimization; ability to review code and provide inputs to improve code quality
  • Ability to evaluate hardware selection for running ML models for optimal performance
  • Up to date with Python libraries and versions for machine learning; extensive hands-on experience with regressors; experience working with data pipelines
  • Deep knowledge of math, probability, statistics and algorithms; working knowledge of Supervised Learning, Adversarial Learning and Unsupervised Learning
  • Deep analytical thinking with excellent problem-solving abilities
  • Strong verbal and written communication skills with a proven ability to work with all levels of management; effective interpersonal and influencing skills
  • Ability to manage a project team through effective allocation of tasks, anticipating risks and setting realistic timelines for managing the expectations of key stakeholders
  • Strong organizational skills and an ability to balance and handle multiple concurrent tasks and/or issues simultaneously
  • Ensure that the project team understands and abides by the compliance framework for policies, data, systems etc. as per group, region and local standards
Number Theory
Nidhi Mishra
Posted by Nidhi Mishra
Gurugram
10 - 12 yrs
₹10L - ₹40L / yr
Artificial Intelligence (AI)
Data Science
Windows Azure
Cloud Computing
Java
+2 more
Project Role Description:
  • Manage the delivery of large, complex Data Science projects using appropriate frameworks, collaborating with stakeholders to manage scope and risk.
  • Help the AI/ML Solution Analyst build solutions to customer needs on our platform, Newgen AI Cloud.
  • Drive profitability and continued success by managing service quality and cost and leading delivery. Proactively support sales through innovative solutions and delivery excellence.

Work Experience: 12+ years
Work location: Gurugram

Key Responsibilities:
1. Collaborate on and contribute to all project phases; bring the technical know-how to design and develop solutions and deploy them at the customer end.
2. End-to-end implementations, i.e. gathering requirements, analysing, designing, coding, and deploying to production.
3. Client-facing role, talking to clients on a regular basis to clarify requirements.
4. Lead the team.

Core Tech Skills: Azure, Cloud Computing, Java/Scala, Python, Design Patterns, and fair knowledge of Data Science and Data Lake/DWH.
Educational Qualification: Engineering graduate, preferably a Computer Science graduate.
Zyvka Global Services
Ridhima Sharma
Posted by Ridhima Sharma
Remote, Bengaluru (Bangalore)
5 - 12 yrs
₹1L - ₹30L / yr
Internet of Things (IOT)
Java
Spring Boot
SQL Server
NOSQL Databases
+5 more
Lead Developer (IoT, Java, Azure)

Responsibilities

  • Design, plan and control the implementation of business solution requests/demands.
  • Execute best practices in design and codification, guiding the rest of the team in accordance with them.
  • Gather requirements and specifications to understand client needs in detail and translate them into system requirements.
  • Drive complex technical projects from planning through execution.
  • Perform code reviews and manage technical debt.
  • Handle release deployments and production issues.
  • Coordinate stress tests, stability evaluations, and support for the concurrent processing of specific solutions.
  • Participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.

Skills

  • Degree in Informatics Engineering, Computer Science, or similar areas
  • Minimum of 5+ years' work experience in similar roles
  • Expert knowledge in developing cloud-based applications with Java, Spring Boot, Spring Rest, Spring JPA, and Spring Cloud
  • Strong understanding of Azure Data Services
  • Strong working knowledge of SQL Server, SQL Azure Database, NoSQL, Data Modeling, Azure AD, ADFS, Identity & Access Management
  • Hands-on experience with the ThingWorx platform (application development, mashup creation, installation of ThingWorx and ThingWorx components)
  • Strong knowledge of IoT platforms
  • Development experience in Microservices Architecture best practices, Docker, and Kubernetes
  • Experience designing/maintaining/tuning high-performance code to ensure optimal performance
  • Strong knowledge of web security practices
  • Experience working in Agile development
  • Knowledge of Google Cloud Platform and Kubernetes
  • Good understanding of Git, source control procedures, and feature branching
  • Fluent in English, written and spoken (mandatory)
Pinghala
Ashwini Dhaipule
Posted by Ashwini Dhaipule
Pune
3 - 5 yrs
₹6L - ₹10L / yr
PowerBI
Data Visualization
Data architecture
Informatica PowerCenter
SQL
+5 more

Pingahla is recruiting Business Intelligence consultants/senior consultants who can help us with Information Management projects (domestic, onshore and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter/Talend DI/Informatica Cloud and must be very proficient with Business Intelligence in general. The job is based out of our Pune office.

Responsibilities:

  • Manage the customer relationship by serving as the single point of contact before, during and after engagements.
  • Architect data management solutions.
  • Provide technical leadership to other consultants and/or customer/partner resources.
  • Design, develop, test and deploy data integration solutions in accordance with the customer's schedule.
  • Supervise and mentor all intermediate and junior level team members.
  • Provide regular reports to communicate status both internally and externally.

Qualifications:

A typical profile that would suit this position would have the following background:

  • A graduate from a reputed engineering college
  • Excellent IQ and analytical skills; able to grasp new concepts and learn new technologies
  • A willingness to work with a small team in a fast-growing environment
  • A good knowledge of Business Intelligence concepts

 

Mandatory Requirements:

  • Knowledge of Business Intelligence
  • Good knowledge of at least one of the following data integration tools - Informatica PowerCenter, Talend DI, Informatica Cloud
  • Knowledge of SQL
  • Excellent English and communication skills
  • Intelligent, quick to learn new technologies
  • Track record of accomplishment and effectiveness in handling customers and managing complex data management needs

 

Servian
sakshi nigam
Posted by sakshi nigam
Bengaluru (Bangalore)
2 - 8 yrs
₹10L - ₹25L / yr
Data engineering
ETL
Data Warehouse (DWH)
Powershell
DA
+7 more
Who we are

We are a consultant-led organisation. We invest heavily in our consultants to ensure they have the technical skills and commercial acumen to be successful in their work.

Our consultants have a passion for data and solving complex problems. They are curious, ambitious and experts in their fields. We have developed a first-rate team, so you will be supported and learn from the best.

About the role

  • Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering on large, complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.

  • As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud-first world and own those integrations end to end, working closely with business units. You will design and build for efficiency, reliability, security and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.

Mandatory experience

    • 1-6 years of relevant experience
    • Strong SQL skills and data literacy
    • Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
    • Proficiency in scripting and automation (e.g. PowerShell, Bash, Python)
    • Experience in an enterprise data environment
    • Strong communication skills

Desirable experience

    • Ability to work on data architecture, data models, data migration, integration and pipelines
    • Ability to work on data platform modernisation from on-premise to cloud-native
    • Proficiency in data security best practices
    • Stakeholder management experience
    • Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
    • Desire to gain breadth and depth of technologies to support customer's vision and project objectives

What to expect if you join Servian?

    • Learning & Development: We invest heavily in our consultants and offer internal training weekly (both technical and non-technical alike!) and abide by a 'You Pass, We Pay' policy.
    • Career progression: We take a longer term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
    • Variety of projects: As a consultant, you will have the opportunity to work across multiple projects across our client base significantly increasing your skills and exposure in the industry.
    • Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks and the metro station.
    • Professional development: We invest heavily in professional development both technically, through training and guided certification pathways, and in consulting, through workshops in client engagement and communication. Growth in our organisation happens from the growth of our people.
MNC
Agency job
via Fragma Data Systems by Priyanka U
Remote, Bengaluru (Bangalore)
2 - 6 yrs
₹6L - ₹15L / yr
Spark
Apache Kafka
PySpark
Internet of Things (IOT)
Real time media streaming

JD for IoT Data Engineer:

 

The role requires experience in Azure core technologies: IoT Hub / Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, and SQL Data Warehouse.

 

You Have:

  • Minimum 2 years of software development experience
  • Minimum 2 years of experience in IoT/streaming data pipeline solution development
  • Bachelor's and/or Master's degree in Computer Science
  • Strong consulting skills in data management, including data governance, data quality, security, data integration, processing, and provisioning
  • Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
  • Translated complex analytical requirements into technical designs including data models, ETLs, and dashboards/reports
  • Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
  • Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
  • Successfully delivered large-scale IoT data management initiatives covering Plan, Design, Build and Deploy phases, leveraging different delivery methodologies including Agile
  • Experience in handling telemetry data with Spark Streaming, Kafka, Flink, Scala, PySpark, and Spark SQL
  • Hands-on experience with containers and Docker
  • Exposure to streaming protocols like MQTT and AMQP
  • Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
  • Strong knowledge of continuous integration, static code analysis, and test-driven development
  • Experience in delivering projects in a highly collaborative delivery model with teams onsite and offshore
  • Excellent analytical and problem-solving skills
  • Delivered change management initiatives focused on driving data platform adoption across the enterprise
  • Strong verbal and written communication skills, as well as the ability to work effectively across internal and external organizations
     

Roles & Responsibilities
 

You Will:

  • Translate functional requirements into technical designs
  • Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine the core Azure services needed to fulfill the technical design
  • Design, develop and deliver data integration interfaces in ADF and Azure Databricks
  • Design, develop and deliver data provisioning interfaces to fulfill consumption needs
  • Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW / Synapse, or SQL
  • Advise clients on ML Engineering and deploying ML Ops at scale on AKS
  • Automate core activities to minimize delivery lead times and improve overall quality
  • Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
  • Deploy Azure DevOps and CI/CD processes
  • Deploy logging and monitoring across the different integration points for critical alerts

 

UnFound
Ankur Pandey
Posted by Ankur Pandey
Mumbai
1 - 40 yrs
₹5L - ₹5L / yr
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
Python
Microservices
+3 more
Does the current state of media frustrate you? Do you want to change the way we consume news? Are you a kickass machine learning practitioner and aspiring entrepreneur, who has opinions on world affairs as well? If so, continue reading!

We at UnFound are developing a product which simplifies complex and cluttered news into simple themes, removes bias by showing all (& often unheard of) perspectives, and produces crisp summaries - all with minimal human intervention!

We are looking for a passionate and experienced machine learning ENGINEER/INTERN, *preferably* with experience in NLP. We want someone who can take initiative. If you need to be micro-managed, this is NOT the role for you.

  1. Demonstrable background in machine learning, especially NLP, information retrieval, etc.
  2. Hands-on with popular data science frameworks - Python, Jupyter, TensorFlow, PyTorch.
  3. Implementation-ready background in deep learning techniques like word embeddings, CNN, RNN/LSTM, etc.
  4. Experience with productionizing machine learning solutions, especially ML-powered mobile/web apps and bots.
  5. Hands-on experience with AWS and other cloud platforms. GPU experience is strongly preferred.
  6. Thorough understanding of back-end concepts and databases (SQL, Postgres, NoSQL, etc.)
  7. Good Kaggle (or similar) scores; MOOCs (Udacity, Coursera, fast.ai, etc.) preferred.