Zyvka Global Services
Lead Developer (IOT, Java, Azure)
Posted by Ridhima Sharma
5 - 12 yrs
₹1L - ₹30L / yr
Remote, Bengaluru (Bangalore)
Skills
Internet of Things (IOT)
Java
Spring Boot
SQL Server
NoSQL Databases
Docker
Kubernetes
Git
Microsoft Windows Azure
SQL Azure

Responsibilities

  • Design, plan, and control the implementation of business solution requests/demands
  • Execute best practices in design and coding, and guide the rest of the team accordingly
  • Gather requirements and specifications to understand client needs in detail and translate them into system requirements
  • Drive complex technical projects from planning through execution
  • Perform code reviews and manage technical debt
  • Handle release deployments and production issues
  • Coordinate stress tests, stability evaluations, and support for the concurrent processing of specific solutions
  • Participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and review code and unit test plans

Skills

  • Degree in Informatics Engineering, Computer Science, or a similar area
  • Minimum of 5 years' work experience in similar roles
  • Expert knowledge in developing cloud-based applications with Java, Spring Boot, Spring REST, Spring JPA, and Spring Cloud (a minimal sketch follows this list)
  • Strong understanding of Azure Data Services
  • Strong working knowledge of SQL Server, SQL Azure Database, NoSQL, Data Modeling, Azure AD, ADFS, and Identity & Access Management
  • Hands-on experience with the ThingWorx platform (application development, mashup creation, installation of ThingWorx and ThingWorx components)
  • Strong knowledge of IoT platforms
  • Development experience with microservices architecture best practices, Docker, and Kubernetes
  • Experience designing/maintaining/tuning high-performance code to ensure optimal performance
  • Strong knowledge of web security practices
  • Experience working in Agile development
  • Knowledge of Google Cloud Platform and Kubernetes
  • Good understanding of Git, source control procedures, and feature branching
  • Fluent in English - written and spoken (mandatory)
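
As a rough illustration of the Java/Spring Boot stack listed above, here is a minimal REST controller sketch (recent Spring Boot and Java 16+ assumed); the DeviceReading payload, the /readings path, and the in-memory store are hypothetical, and a real service would add Spring Data JPA persistence, validation, and security.

    // Minimal Spring Boot REST controller sketch (hypothetical DeviceReading payload).
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.*;

    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    @SpringBootApplication
    @RestController
    @RequestMapping("/readings")
    public class ReadingsApplication {

        // In-memory store for illustration only; a real service would use Spring Data JPA.
        private final List<DeviceReading> readings = new CopyOnWriteArrayList<>();

        public static void main(String[] args) {
            SpringApplication.run(ReadingsApplication.class, args);
        }

        @PostMapping
        public DeviceReading ingest(@RequestBody DeviceReading reading) {
            readings.add(reading);
            return reading;
        }

        @GetMapping
        public List<DeviceReading> list() {
            return readings;
        }

        // Simple payload type; Java 16+ record syntax, serialized by Jackson.
        public record DeviceReading(String deviceId, double temperature, long timestamp) {}
    }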

About Zyvka Global Services

Founded: 2021
Stage: Bootstrapped
About: N/A

Connect with the team: Shraddha Jain, Ridhima Sharma, Vaishnavi Shingane, Medhya Sinha, Amber Srivastava

Similar jobs

mazosol
Posted by Kirthick Murali
Mumbai
10 - 20 yrs
₹30L - ₹58L / yr
Python
R Programming
PySpark
Google Cloud Platform (GCP)
SQL Azure

Data Scientist – Program Embedded 

Job Description:   

We are seeking a highly skilled and motivated senior data scientist to support a big data program. The successful candidate will play a pivotal role in supporting multiple projects in this program, covering traditional tasks from revenue management, demand forecasting, and improving customer experience to testing and using new tools/platforms such as Copilot and Fabric for different purposes. The expected candidate will have deep expertise in machine learning methodology and applications, and should have completed multiple large-scale data science projects (full cycle from ideation to BAU). Beyond technical expertise, problem solving in complex set-ups will be key to success in this role. This is a data science role directly embedded into the program/projects, so stakeholder management and collaboration with partners are crucial to success in this role (on top of the deep expertise).

What we are looking for: 

  1. Highly proficient in Python/PySpark/R.
  2. Understanding of MLOps concepts and working experience in product industrialization (from a data science point of view); experience in building products for live deployment, with continuous development and continuous integration.
  3. Familiarity with cloud platforms such as Azure and GCP and the data management systems on such platforms; familiarity with Databricks and product deployment on Databricks.
  4. Experience in ML projects involving techniques such as regression, time series, clustering, classification, dimension reduction, and anomaly detection, using both traditional ML approaches and DL approaches.
  5. Solid background in statistics, probability distributions, A/B testing validation, univariate/multivariate analysis, hypothesis tests for different purposes, data augmentation, etc. (see the sketch after this list).
  6. Familiarity with designing testing frameworks for different modelling practices/projects based on business needs.
  7. Exposure to Gen AI tools, enthusiasm for experimenting, and new ideas on what can be done.
  8. Having improved an internal company process using an AI tool would be a plus (e.g. process simplification, manual task automation, auto emails).
  9. Ideally, 10+ years of experience in independent business-facing roles.
  10. CPG or retail experience as a data scientist would be nice, but it is not the top priority, especially for those who have navigated multiple industries.
  11. Being proactive and collaborative is essential.
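
For the A/B testing and hypothesis-testing point in item 5 above, a minimal plain-Java sketch of a two-proportion z-test is shown below; the conversion counts are made up, and a production analysis would also verify sample-size and validity assumptions.

    // Minimal two-proportion z-test sketch for A/B test validation (illustrative numbers).
    public class TwoProportionZTest {
        public static void main(String[] args) {
            long convA = 1200, totalA = 20000;   // control: conversions / visitors (hypothetical)
            long convB = 1320, totalB = 20000;   // variant: conversions / visitors (hypothetical)

            double pA = (double) convA / totalA;
            double pB = (double) convB / totalB;
            double pooled = (double) (convA + convB) / (totalA + totalB);

            // Standard error under the null hypothesis of equal conversion rates.
            double se = Math.sqrt(pooled * (1 - pooled) * (1.0 / totalA + 1.0 / totalB));
            double z = (pB - pA) / se;

            System.out.printf("pA=%.4f pB=%.4f z=%.3f%n", pA, pB, z);
            // |z| > 1.96 corresponds to significance at the 5% level (two-sided).
        }
    }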

 

Some projects examples within the program: 

  1. Test new tools/platforms such as Copilot and Fabric for commercial reporting: testing, validation, and building trust.
  2. Building algorithms for predicting trends in category consumption to support dashboards.
  3. Revenue Growth Management: create/understand the algorithms behind the tools (which can be built by 3rd parties) we need to maintain or choose to improve; be able to prioritize and build a product roadmap; be able to design new solutions and articulate/quantify their limitations.
  4. Demand forecasting: create localized forecasts to improve in-store availability, with proper model monitoring for early detection of potential issues in the forecast, focusing particularly on improving the end-user experience.


Wonder Worth Solutions Pvt Ltd
Vellore
4 - 7 yrs
₹3.5L - ₹5L / yr
Machine Learning (ML)
Data Science
Python
Java
C++
+1 more

As a part of WWS, your expertise in machine learning will help us extract value from our data. You will lead all the processes from data collection, cleaning, and preprocessing to training models and deploying them to production. The ideal candidate will be passionate about artificial intelligence and stay up to date with the latest developments in the field.

What We Expect

  • A Bachelor's/Master's degree in IT, computer science, or a related advanced field is preferred.
  • At least 3 years of experience working with ML libraries and packages.
  • Familiarity with programming languages, including Python, Java, C++, and SAS.
  • Strong experience in programming and statistics.
  • Well-versed in data science and neural network architectures.
  • Flexibility in shifts is appreciated.

A Machine Learning Engineer's Ideal Day At WWS

Design and Develop. The primary responsibilities include implementing machine learning algorithms and running AI system experiments and tests. Designing and developing machine learning systems, along with performing statistical analyses, are part of the developer's day-to-day activities.

Algorithm Assertion. The engineers act as critical members of the data science team: their tasks involve researching, asserting, and designing the artificial intelligence responsible for machine learning, as well as maintaining and improving existing artificial intelligence systems.

Research and Development. Analyze large, complex datasets to extract insights and decide on the appropriate techniques; research and implement best practices to improve the existing machine learning infrastructure; and provide support to engineers and product managers in implementing machine learning in the product.
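
As a small, hedged illustration of the kind of statistical modelling described above, the sketch below fits a least-squares regression with Apache Commons Math (commons-math3 assumed on the classpath); the data points are invented.

    // Least-squares regression sketch using Apache Commons Math (commons-math3 assumed).
    import org.apache.commons.math3.stat.regression.SimpleRegression;

    public class RegressionSketch {
        public static void main(String[] args) {
            SimpleRegression regression = new SimpleRegression();

            // Hypothetical (x, y) observations, e.g. ad spend vs. conversions.
            double[][] observations = {
                {1.0, 2.1}, {2.0, 3.9}, {3.0, 6.2}, {4.0, 7.8}, {5.0, 10.1}
            };
            regression.addData(observations);

            System.out.printf("slope=%.3f intercept=%.3f r^2=%.3f%n",
                    regression.getSlope(), regression.getIntercept(), regression.getRSquare());

            // Predict y for a new x value.
            System.out.printf("prediction at x=6: %.3f%n", regression.predict(6.0));
        }
    }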

What You Can Expect

  • Full-time, salaried positions complemented with welfare programs.
  • Competitive salary and tailored training in the core space with recognition potential and annual bonus.
  • Periodic performance appraisals.
  • Attendance Incentives.
  • Working with the best and budding talent in the industry.
  • A conducive work environment with dynamic benefits.

Why Consider Machine Learning Engineer as a career with WWS?

WWS offers an appealing work environment that makes it easier to build relationships with other staff members and clients. You may also have an opportunity to learn other aspects of the work on the job, which can enhance your experience and qualifications.

Many businesses must proactively react to changing factors, such as patterns of customer behavior or prices. Tracking model performance and retraining models once fresher data is available are key to success. These responsibilities fall within the MLE's remit and have become crucial for many organizations.

Please attach your resume and let us know through email your current address, phone number, and the best time to contact you by phone.

Apply to this Job

Publicis Sapient
Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, and wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on the design, development, and delivery of solutions involving:

  • Data Ingestion, Integration and Transformation
  • Data Storage and Computation Frameworks, Performance Optimizations
  • Analytics & Visualizations
  • Infrastructure & Cloud Computing
  • Data Management Platforms
  • Building functionality for data ingestion from multiple heterogeneous sources in batch & real-time (see the sketch after this list)
  • Building functionality for data analytics, search and aggregation
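
As referenced in the ingestion bullet above, here is a minimal batch-ingestion sketch using the Spark Java API; the input/output paths and the event_id / event_date columns are assumptions for illustration, not part of the actual project.

    // Batch ingestion sketch with the Spark Java API (paths and columns are hypothetical).
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class BatchIngestJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("batch-ingest")
                    .getOrCreate();

            // Read raw JSON events from a landing zone (source path is an assumption).
            Dataset<Row> events = spark.read()
                    .json("hdfs:///landing/events/*.json");

            // Light transformation: drop obvious duplicates before persisting.
            Dataset<Row> deduped = events.dropDuplicates(new String[] {"event_id"});

            // Write to a curated zone in Parquet, partitioned by event date.
            deduped.write()
                    .mode(SaveMode.Overwrite)
                    .partitionBy("event_date")
                    .parquet("hdfs:///curated/events");

            spark.stop();
        }
    }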


Experience Guidelines:

Mandatory Experience and Competencies:

  1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies
  2. Minimum 1.5 years of experience in Big Data technologies
  3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines; working knowledge of real-time data pipelines is an added advantage
  4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred)
  5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

  1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
  2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
  3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
  4. Performance tuning and optimization of data pipelines
  5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality
  6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
  7. Cloud data specialty and other related Big Data technology certifications



Personal Attributes:

  • Strong written and verbal communication skills
  • Articulation skills
  • Good team player
  • Self-starter who requires minimal oversight
  • Ability to prioritize and manage multiple tasks
  • Process orientation and the ability to define and set up processes

Intelekt AI (Previously Techweirdo)
Mumbai
5 - 9 yrs
₹14L - ₹15L / yr
C#
.NET
Microsoft Windows Azure
Internet of Things (IOT)
Cloud Computing

Techweirdo delivers AI models & enterprise solutions globally for mid- to large-scale organizations.

We offer consultation, services, and products to holistically address the digital transformation goals of an enterprise.

We are currently hiring passionate, senior IoT cloud engineers on behalf of one of our large customers to help them find the best-fit talent and create technologically challenging, visually delightful, and easy-to-use digital products in a fast-paced environment.

 

Skills/ Role Requirements:

  • Good hands-on experience with Azure IoT Gateway / IoT Hub development
  • Good hands-on experience with Azure Functions, Azure Event Hubs, Azure IoT Edge, and cloud platform security
  • Strong knowledge of C# .NET
  • Industrial IoT experience is a must
  • Device communication knowledge, with exposure to different protocols, is an advantage (see the MQTT sketch after this list)
  • Good communication skills, as this position requires consistent interaction with business stakeholders and other engineers
  • Hands-on experience in optimization, architecture, and building scalable real-time data pipelines is a plus
  • 5+ years of relevant experience
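
The role above is C#/.NET-centric, but as a language-neutral illustration of device-to-cloud telemetry over MQTT (one of the protocols referenced in the list), here is a minimal sketch using the Eclipse Paho Java client; the broker URL, topic, and payload are placeholders, and an Azure IoT Hub deployment would normally go through the Azure IoT device SDK with SAS- or certificate-based authentication.

    // Device-to-cloud telemetry sketch using the Eclipse Paho MQTT client (broker/topic are placeholders).
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
    import org.eclipse.paho.client.mqttv3.MqttException;
    import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

    public class TelemetryPublisher {
        public static void main(String[] args) throws MqttException {
            String brokerUrl = "ssl://example-broker:8883";   // placeholder endpoint
            String clientId = "device-001";                   // placeholder device id

            MqttClient client = new MqttClient(brokerUrl, clientId, new MemoryPersistence());

            MqttConnectOptions options = new MqttConnectOptions();
            options.setCleanSession(true);
            // Real deployments would set credentials/TLS here (username, password, socket factory).

            client.connect(options);

            String payload = "{\"deviceId\":\"device-001\",\"temperature\":21.7}";
            // QoS 1 = at-least-once delivery; retained = false.
            client.publish("devices/device-001/telemetry", payload.getBytes(), 1, false);

            client.disconnect();
            client.close();
        }
    }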

 

 

Perks:

  1. Surrounded by curious learners: with a growth mindset as our core strength, we have created a learning environment full of curious tech learners.
  2. New challenges every day: there is no ordinary day at TechWeirdo; if you like solving problems, then this is the right place for you.
  3. Zero micro-management, limited supervision: we encourage our team to take on challenging tasks and solve complex problems by taking ownership of their tasks. We trust our team to take calculated risks.
  4. Great networking: you will be connected with C-suite executives of top organizations while working with our winning team.
  5. Building technology how you want, when you want: we welcome people who see things differently, as they are the ones who have the ability to change the world.
Codvoai
Posted by Akanksha Kondagurla
Hyderabad
3 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
J2EE
Spring Boot
+2 more

At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

 

Key Responsibilities

  • The Kafka Engineer is responsible for designing and recommending the best approach for data movement to/from different sources using Apache/Confluent Kafka.
  • Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply knowledge of best practices.
  • Develop and ensure adherence to published system architectural decisions and development standards.
  • Must be comfortable working with offshore/global teams to deliver projects.

 

 

Required Skills

 

  • Good understanding of event-based architecture, messaging frameworks, and stream processing solutions using the Kafka messaging framework (a minimal producer sketch follows this list).
  • 3+ years of hands-on experience working with Kafka Connect using the schema registry in a high-volume environment.
  • Strong knowledge of and exposure to Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
  • Good experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, file stream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
  • Hands-on experience developing APIs and microservices.
  • Solid expertise in Java.
  • Working knowledge of Kafka REST Proxy and experience building custom connectors using Kafka core concepts and APIs.
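
As referenced in the first bullet above, here is a minimal Java producer sketch using the standard kafka-clients API; the broker address, topic, and payload are placeholders, and a Confluent setup with a schema registry would typically swap the string serializers for Avro/Protobuf ones.

    // Minimal Kafka producer sketch using the standard kafka-clients API (broker and topic are placeholders).
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class OrderEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for full ISR acknowledgement

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("orders", "order-42", "{\"orderId\":42,\"status\":\"CREATED\"}");

                // Asynchronous send; the callback reports the partition/offset or any error.
                producer.send(record, (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("written to %s-%d@%d%n",
                                metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
                producer.flush();
            }
        }
    }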

 

Good to have: 

 

  •  Understanding of Data warehouse architecture and data modelling
  •  Good knowledge of big data ecosystem to design and develop capabilities to deliver solutions using CI/CD pipelines.
  •  Good understanding of other AWS services such as CloudWatch monitoring, scheduling, and automation services
  •  Strong skills in In-memory applications, Database Design, and Data Integration
  •  Ability to guide and mentor team members on using Kafka.

 

 

Experience: 3 to 8 Years 

Work timings: 2.30 PM - 11.30 PM
Location: Hyderabad

Bengaluru (Bangalore)
8 - 12 yrs
₹20L - ₹25L / yr
Data engineering
Spark
Big Data
Data engineer
Hadoop
+13 more
  • Play a critical role as a member of the leadership team in shaping and supporting our overall company vision, day-to-day operations, and culture.
  • Set the technical vision and build the technical product roadmap from launch to scale, including defining long-term goals and strategies
  • Define best practices around coding methodologies, software development, and quality assurance
  • Define innovative technical requirements and systems while balancing time, feasibility, cost and customer experience
  • Build and support production products
  • Ensure our internal processes and services comply with privacy and security regulations
  • Establish a high performing, inclusive engineering culture focused on innovation, execution, growth and development
  • Set a high bar for our overall engineering practices in support of our mission and goals
  • Develop goals, roadmaps and delivery dates to help us scale quickly and sustainably
  • Collaborate closely with Product, Business, Marketing and Data Science
  • Experience with financial and transactional systems
  • Experience engineering for large volumes of data at scale
  • Experience with financial audit and compliance is a plus
  • Experience building a successful consumer facing web and mobile apps at scale
Streamoid Technologies Pvt Ltd
Agency job
via HyreSpree by HyreSpree Team
Bengaluru (Bangalore)
4 - 6 yrs
₹4L - ₹20L / yr
Natural Language Processing (NLP)
PyTorch
Python
Java
Solr
+1 more
Skill Set:
  • 4+ years of experience. Solid understanding of Python, Java, and general software development skills (source code management, debugging, testing, deployment, etc.).
  • Experience working with Solr and Elasticsearch. Experience with NLP technologies and the handling of unstructured text. Detailed understanding of text pre-processing and normalisation techniques such as tokenisation, lemmatisation, stemming, and POS tagging (see the sketch after this list).
  • Prior experience implementing traditional ML solutions for classification, regression, or clustering problems. Expertise in text analytics - sentiment analysis, entity extraction, language modelling - and associated sequence learning models (RNN, LSTM, GRU).
  • Comfortable working with deep-learning libraries (e.g. PyTorch).
  • Candidates can even be freshers with 1 or 2 years of experience; IIIT, BITS Pilani, and the top 5 local colleges and universities are preferred.
  • A Master's candidate in machine learning.
  • Can source candidates from Mu Sigma and Manthan.
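
As a rough sketch of the text pre-processing and normalisation steps mentioned above (tokenisation, lowercasing, stop-word removal), here is a plain-Java example; real pipelines would use an NLP library for lemmatisation and POS tagging, and the stop-word list here is deliberately tiny.

    // Plain-Java text normalisation sketch: tokenise, lowercase, strip punctuation, drop stop words.
    // Real pipelines would add lemmatisation/POS tagging via an NLP library; this is illustrative only.
    import java.util.Arrays;
    import java.util.List;
    import java.util.Set;
    import java.util.stream.Collectors;

    public class TextPreprocess {
        private static final Set<String> STOP_WORDS = Set.of("the", "a", "an", "is", "and", "of", "to");

        public static List<String> normalise(String text) {
            return Arrays.stream(text.toLowerCase().split("\\s+"))
                    .map(token -> token.replaceAll("[^\\p{L}\\p{Nd}]", "")) // strip punctuation
                    .filter(token -> !token.isEmpty())
                    .filter(token -> !STOP_WORDS.contains(token))
                    .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            System.out.println(normalise("The delivery of the order is delayed, and the customer is unhappy!"));
            // -> [delivery, order, delayed, customer, unhappy]
        }
    }
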
Angel One
Posted by Andleeb Mujeeb
Remote only
2 - 6 yrs
₹12L - ₹18L / yr
Amazon Web Services (AWS)
PySpark
Python
Scala
Go Programming (Golang)
+19 more

Designation: Specialist - Cloud Service Developer (ABL_SS_600)

Position description:

  • The person would be primarily responsible for developing solutions using AWS services, e.g. Fargate, Lambda, ECS, ALB, NLB, S3, etc.
  • Apply advanced troubleshooting techniques to provide solutions to issues pertaining to service availability, performance, and resiliency
  • Monitor and optimize performance using AWS dashboards and logs
  • Partner with engineering leaders and peers in delivering technology solutions that meet the business requirements
  • Work with the cloud team in an agile approach and develop cost-optimized solutions

 

Primary Responsibilities:

  • Develop solutions using AWS services including Fargate, Lambda, ECS, ALB, NLB, S3, etc.

 

Reporting Team

  • Reporting Designation: Head - Big Data Engineering and Cloud Development (ABL_SS_414)
  • Reporting Department: Application Development (2487)

Required Skills:

  • AWS certification would be preferred
  • Good understanding of monitoring (CloudWatch, alarms, logs, custom metrics, Trust SNS configuration)
  • Good experience with Fargate, Lambda, ECS, ALB, NLB, S3, Glue, Aurora, and other AWS services (a minimal Lambda handler sketch follows this list)
  • Knowledge of storage (S3, lifecycle management, event configuration) is preferred
  • Good grasp of data structures and programming in PySpark / Python / Golang / Scala
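
To illustrate the Lambda piece of the AWS stack listed above, here is a minimal Java handler sketch using aws-lambda-java-core; the event shape and the greeting logic are hypothetical, and a real function would be wired to an actual trigger (ALB, S3, API Gateway, etc.).

    // Minimal AWS Lambda handler sketch using aws-lambda-java-core (event shape is hypothetical).
    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;

    import java.util.Map;

    public class GreetingHandler implements RequestHandler<Map<String, String>, String> {

        @Override
        public String handleRequest(Map<String, String> event, Context context) {
            // Log through the Lambda-provided logger; "name" is an assumed key in the input event.
            String name = event.getOrDefault("name", "world");
            context.getLogger().log("Handling request for: " + name);
            return "Hello, " + name + "!";
        }
    }
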
NCR (Delhi | Gurgaon | Noida)
2 - 12 yrs
₹25L - ₹40L / yr
Data governance
DevOps
Data integration
Data engineering
Python
+14 more
Data Platforms (Data Integration) is responsible for envisioning, building, and operating the Bank's data integration platforms. The successful candidate will work out of Gurgaon as part of a high-performing team that is distributed across our two development centers – Copenhagen and Gurugram. The individual must be driven, passionate about technology, and display a level of customer service that is second to none.

Roles & Responsibilities

  • Designing and delivering a best-in-class, highly scalable data governance platform
  • Improving processes and applying best practices
  • Contributing to all scrum ceremonies and assuming the role of 'scrum master' on a rotational basis
  • Development, management, and operation of our infrastructure to ensure it is easy to deploy, scalable, secure, and fault-tolerant
  • Flexibility in working hours as per business needs
Alien Brains
Posted by Praveen Baheti
Kolkata
2 - 7 yrs
₹5L - ₹8L / yr
Java
Python
Javascript
Machine Learning (ML)
Deep Learning
+2 more
A research lab with roots in innovation, we are looking for someone who can take the reins of our AI-based development think tank. Given good work ethic and results, the salary can be renegotiated in 5 months.