11+ Modbus Jobs in Pune | Modbus Job openings in Pune
Are you ready to revolutionize the manufacturing landscape? We're on the hunt for a dynamic OT-IT Expert who’s not just skilled but passionate about transforming operational technology! Join us in optimizing our manufacturing shop floors by auditing and enhancing infrastructure and connectivity. Your mission? Identify those sneaky gaps in our data collection architecture and ensure our machines and sensors talk smoothly to the cloud. If you thrive in IIoT environments and can navigate the complexities of communication protocols like a pro, we want you on our team!
Key Responsibilities:
- Audit Like a Pro: Dive deep into our existing infrastructure and connectivity on the manufacturing shop floor.
- Gap Detective: Identify gaps in our architecture for seamless data collection from machines and sensors to cloud systems.
- Solution Architect: Design and propose the best-suited, cost-effective solution architectures based on your gap analysis.
- Proposal Wizard: Develop and prepare detailed project proposals that impress stakeholders.
- Supplier Negotiator: Identify and negotiate with suppliers for the required hardware and peripherals to bring your designs to life.
- Challenge Anticipator: Plan ahead for potential implementation challenges and devise clever solutions.
- Project Overlord: Oversee IIoT project implementations to ensure they stay on track and meet objectives.
- Data Accuracy Guardian: Ensure the accuracy of captured data by validating against actual shop floor values.
- Data Structurer: Develop and implement data format structuring and standard frameworks that enhance our operations.
- Quality Assurance Champion: Perform thorough Quality Assurance (QA) on collected data before handing it over to our data engineers.
Required Skills and Experience:
- Extensive experience in IIoT projects involving field communication protocols (Modbus, OPC, PROFINET, PROFIBUS).
- Strong knowledge of PLC programming with hands-on experience in multiple PLC brands (Siemens, Allen-Bradley, Mitsubishi).
- Proficient in auditing and analyzing OT infrastructure like a seasoned expert.
- Expertise in cloud connectivity and data collection systems.
- Creative thinker with the ability to design scalable and cost-effective IIoT architectures.
- Strong negotiation skills to manage vendor relationships effectively.
- Exceptional problem-solving and forward-thinking abilities.
- Solid understanding of manufacturing processes and shop floor operations.
- Proven experience in data validation, structuring, and quality assurance.
- Familiarity with data engineering concepts and the ability to liaise with data engineering teams.
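To make the field-protocol side of this role concrete: many Modbus devices expose a 32-bit sensor value as two consecutive 16-bit holding registers, which the OT-IT expert must decode and validate against shop-floor readings. The sketch below shows the decoding step only; it assumes big-endian word order (high word first), which is common but device-specific, so the device's register map always has the final say. With a client library such as pymodbus, the two words would come back from a holding-register read.

```python
import struct

def registers_to_float32(high_word: int, low_word: int) -> float:
    """Combine two 16-bit Modbus holding registers into an IEEE-754 float32.

    Assumes big-endian word order (high word first); many devices use the
    opposite order, so always check the device's register map.
    """
    raw = struct.pack(">HH", high_word, low_word)  # 4 bytes, big-endian
    return struct.unpack(">f", raw)[0]

# Example: a temperature of 25.0 encodes as the register pair 0x41C8, 0x0000
print(registers_to_float32(0x41C8, 0x0000))  # → 25.0
```

Validating this decoded value against the actual gauge on the shop floor is exactly the "data accuracy guardian" step described above.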
Preferred Qualifications:
- Bachelor's degree in Instrumentation, Information Technology, Electrical Engineering, or a related field.
- Relevant certifications in Industrial Automation or IIoT technologies.
- 5+ years of experience in OT/IT convergence projects within manufacturing environments.
- Experience with data standardization and framework development in industrial settings.
Key Attributes:
- Strategic thinker with a hands-on approach who can get things done.
- Excellent communication skills to connect with both technical and non-technical stakeholders.
- Proactive problem-solver with a history of successful project deliveries.
- Adaptable to rapidly evolving technology landscapes, ready for anything!
- Meticulous attention to detail, especially in data accuracy and quality.
- Strong analytical skills for data validation and structure optimization.
Job Summary:
We are looking for a highly skilled and experienced Data Engineer with deep expertise in Airflow, dbt, Python, and Snowflake. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines and transformation frameworks to enable robust data workflows across the organization.
Key Responsibilities:
- Design and implement scalable ETL/ELT pipelines using Apache Airflow for orchestration.
- Develop modular and maintainable data transformation models using dbt.
- Write high-performance data processing scripts and automation using Python.
- Build and maintain data models and pipelines on Snowflake.
- Collaborate with data analysts, data scientists, and business teams to deliver clean, reliable, and timely data.
- Monitor and optimize pipeline performance and troubleshoot issues proactively.
- Follow best practices in version control, testing, and CI/CD for data projects.
Must-Have Skills:
- Strong hands-on experience with Apache Airflow for scheduling and orchestrating data workflows.
- Proficiency in dbt (data build tool) for building scalable and testable data models.
- Expert-level skills in Python for data processing and automation.
- Solid experience with Snowflake, including SQL performance tuning, data modeling, and warehouse management.
- Strong understanding of data engineering best practices including modularity, testing, and deployment.
Good to Have:
- Experience working with cloud platforms (AWS/GCP/Azure).
- Familiarity with CI/CD pipelines for data (e.g., GitHub Actions, GitLab CI).
- Exposure to modern data stack tools (e.g., Fivetran, Stitch, Looker).
- Knowledge of data security and governance best practices.
Note: One face-to-face (F2F) round is mandatory; as per the process, you will need to visit the office for it.
- BE/BTech/BS or equivalent
- 7+ years of experience in Java and Spring Boot
- Strong fundamentals in data structures, algorithms, and object-oriented programming
- 4+ years of hands-on experience in designing, developing, and delivering large-scale (distributed) system architectures with complex software design, high scalability, and availability
- Extensive experience with technical leadership, defining visions/solutions, and collaborating/driving to see them through to completion
- Excellent analytical and problem-solving skills
- Experience with any RDBMS and strong SQL knowledge
- Comfortable with the Unix/Linux command line
Nice-to-Have Skills:
- Experience with Big Data platforms like Hadoop / Hive / Presto
- Experience with ML/AI frameworks like TensorFlow, H2O, etc.
- Experience with key-value stores or NoSQL databases
- Good understanding of Docker and container platforms like Mesos and Kubernetes
- Security-first architecture approach
- Application benchmarking and optimization
We are looking for a React Native developer interested in building high-performance mobile apps on both the iOS and Android platforms. You will be responsible for architecting and building these applications, as well as coordinating with the teams responsible for other layers of the product infrastructure. Building a product is a highly collaborative effort, and as such, a strong team player with a commitment to perfection is required.
Responsibilities
• Build pixel-perfect, buttery-smooth UIs across both mobile platforms.
• Leverage native APIs for deep integrations with both platforms.
• Diagnose and fix bugs and performance bottlenecks for performance that feels native.
• Reach out to the open source community to encourage and help implement mission-critical software fixes—React Native moves fast and often breaks things.
• Maintain code and write automated tests to ensure the product is of the highest quality.
- Firm grasp of the JavaScript and TypeScript languages and their nuances, including ES6+ syntax
- Knowledge of functional and object-oriented programming
- Ability to write well-documented, clean JavaScript code
- Rock-solid at working with third-party dependencies and debugging dependency conflicts
- Familiarity with native build tools like Android Studio and IntelliJ
- We are only considering Pune-based candidates who can start immediately / within 30 days
Our client is the fastest growing company in the space of student mobility in Asia Pacific as per Financial times.
More about them
- Long-term accommodation booking platform for students (think booking.com for student housing).
- Helps 80M students worldwide find and book full-time accommodation near their universities, without the hassle of negotiation, non-standardized and cumbersome paperwork, and broken payment processes.
- Leading student housing platform globally, with 1M+ student housing units listed in 6 countries and across 80 cities.
- Growing rapidly and targeting $400M in annual gross bookings value by 2022.
We are looking to hire a Director of Engineering: a thought leader and world-class engineer with a proven record of creating an impact on business and engineering with little or no help. You will own the technology vision and significantly contribute to building the engineering team and culture. You will partner with the product and business teams to understand product features and specifications, translate them into high-level and low-level designs, and thereby facilitate the team in building mission-critical applications.
Key responsibilities
- Build next-generation web applications that are efficient, reliable, and scalable.
- Explore and design dynamic and compelling consumer experiences.
- Analyze system function and performance requirements to support design concepts.
- Actively participate in design and code reviews to build robust applications and prototypes.
- Work closely with product managers, designers, and the business team to implement solutions.
- Maintain a very high standard of product quality, ensuring a delightful experience for users.
- Endorse upcoming standards; launch, iterate, and make a difference.
Technical Skills
1) 4+ years of experience working with web technologies and building scalable, distributed software products and infrastructure
2) Exposure to complete product development cycles: from inception to production to scaling up, supporting new requirements, and re-architectures
3) Sound knowledge of design patterns and practices for writing clean, linted, maintainable, and reusable code
4) Experience working with both backend and frontend web technologies
5) Experience with infrastructure deployment and maintenance
6) Keeps an eye on new platform and ecosystem changes
Regards
Team Merito
We are looking for a Spark developer who knows how to fully exploit the potential of our Spark cluster. You will clean, transform, and analyze vast amounts of raw data from various systems using Spark to provide ready-to-use data to our feature developers and business analysts. This involves both ad-hoc requests as well as data pipelines that are embedded in our production environment.
Requirements:
- The candidate should be well-versed in the Scala programming language
- Should have experience with Spark architecture and Spark internals
- Experience in AWS is preferable
- Should have experience in the full life cycle of at least one big data application
PostgreSQL Developer cum Database Architect
5-7 years of experience
Knowledge and experience, specifically with PostgreSQL:
- SQL
- Query optimization / optimization strategies (in PostgreSQL)
- AWS RDS (Relational Database Service)
- Optional: AWS RDS API / PostgreSQL API
Roles & Responsibilities:
- Design, implement and maintain all AWS infrastructure and services within a managed service environment
- Should be able to work 24x7 shifts to support the infrastructure.
- Design, Deploy and maintain enterprise class security, network and systems management applications within an AWS environment
- Design and implement availability, scalability, and performance plans for the AWS managed service environment
- Continual re-evaluation of existing stack and infrastructure to maintain optimal performance, availability and security
- Manage the production deployment and deployment automation
- Implement process and quality improvements through task automation
- Institute infrastructure as code, security automation, and automation of routine maintenance tasks
- Experience with containerization and orchestration tools like Docker and Kubernetes
- Build, deploy, and manage Kubernetes clusters through automation
- Create and deliver knowledge sharing presentations and documentation for support teams
- Learn on the job and explore new technologies with little supervision
- Work effectively with onsite/offshore teams
Qualifications:
- Must have Bachelor's degree in Computer Science or related field and 4+ years of experience in IT
- Experience in designing, implementing, and maintaining all AWS infrastructure and services
- Design and implement availability, scalability, and performance plans for the AWS managed service environment
- Continual re-evaluation of existing stack and infrastructure to maintain optimal performance, availability, and security
- Hands-on technical expertise in Security Architecture, automation, integration, and deployment
- Familiarity with compliance & security standards across the enterprise IT landscape
- Extensive experience with Kubernetes and AWS (IAM, Route53, SSM, S3, EFS, EBS, ELB, Lambda, CloudWatch, CloudTrail, SQS, SNS, RDS, CloudFormation, DynamoDB)
- Solid understanding of AWS IAM Roles and Policies
- Solid Linux experience with a focus on web (Apache Tomcat/Nginx)
- Experience with automation/configuration management using Terraform, Chef, Ansible, or similar.
- Understanding of protocols/technologies like Microservices, HTTP/HTTPS, SSL/TLS, LDAP, JDBC, SQL, HTML
- Experience in managing and working with the offshore teams
- Familiarity with CI/CD systems such as Jenkins, GitLab CI
- Scripting experience (Python, Bash, etc.)
- AWS, Kubernetes Certification is preferred
- Ability to work with and influence Engineering teams
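As an illustration of the IAM roles-and-policies knowledge called for above, least-privilege policies are often generated programmatically (e.g., as input to Terraform or CloudFormation). A minimal sketch, assuming a hypothetical bucket name; the `Version` string and S3 action names are standard IAM policy syntax:

```python
import json

def s3_read_only_policy(bucket: str) -> str:
    """Render a minimal least-privilege IAM policy granting read-only
    access to a single S3 bucket. The bucket name is a placeholder."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",       # bucket-level (ListBucket)
                    f"arn:aws:s3:::{bucket}/*",     # object-level (GetObject)
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(s3_read_only_policy("example-data-bucket"))
```

Note the split between the bucket ARN and the `/*` object ARN: `s3:ListBucket` applies to the bucket resource, while `s3:GetObject` applies to objects, a distinction that frequently trips up hand-written policies.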
Job Brief:
We are looking for candidates who have development experience and have delivered CI/CD-based projects. Should have good hands-on experience with Jenkins master/agent (master-slave) architecture and with AWS native services such as CodeCommit, CodeBuild, CodeDeploy, and CodePipeline. Should have experience setting up cross-platform CI/CD pipelines that span different cloud platforms, or a mix of on-premises and cloud environments.
Job Location:
Pune.
Job Description:
- Hands on with AWS (Amazon Web Services) Cloud with DevOps services and CloudFormation.
- Experience interacting with customers.
- Excellent communication skills.
- Hands-on experience creating and managing Jenkins jobs, and Groovy scripting.
- Experience in setting up Cloud Agnostic and Cloud Native CI/CD Pipelines.
- Experience in Maven.
- Experience in scripting languages like Bash, PowerShell, and Python.
- Experience in automation tools like Terraform, Ansible, Chef, Puppet.
- Excellent troubleshooting skills.
- Experience in Docker and Kubernetes, including writing Dockerfiles.
- Hands on with version control systems like GitHub, Gitlab, TFS, BitBucket, etc.
Preferred Skills:
- 12 to 15 years of experience in Product Development, Product Management, and/or Project Management of software solutions in the Transportation/Railroad domain, with a preferred 4+ years on North American railroads. Preference will be given to candidates with deep domain knowledge of North American railroads.
- Hands-on experience in systems such as Order/Waybill Management, Service Design, Rail Operations, Mechanical, Revenue Management, Intermodal, Locomotive Management, and Railroad Freight Management processes would be ideal.
- Partner with our functional and engineering teams to determine what can be delivered through balancing the need for new features, defects, and technical debt.
- Collaborate with Agile team members to effectively execute Scrum ceremonies through Sprint Planning, Retrospectives, and Daily Standups.
- Prioritize backlog in accordance with the understanding and validation of customer needs and associated outcomes.
- Experience and expertise in Agile methodologies (e.g., Scrum, Kanban, SAFe).
- Proficient in MPP, MS Office, with emphasis on Visio and MS Excel.
- Understanding of functional and data integration dependencies and issues.
- Should be able to manage the team diligently.
- Familiarity with complex data and structures.
- Knowledge of physical and conceptual/logical data modelling concepts.
- A self-starter with a can-do attitude, i.e., able to identify areas to apply yourself, take ownership/responsibility, and be the go-to person in those areas.
- Flexibility: comfortable and effective in various roles (wearing different hats), with the ability to go broad and/or deep as needed.
- Must be comfortable with high levels of ambiguity and be willing to make reasoned decisions.
- A highly collaborative working style
- An ability to quickly absorb business processes and systems that implement those processes
- An ability to work with geographically distributed teams
- Ability to engage with diverse audience from business users to CXOs
- Proficient in presentations, written and verbal communication
- Ability to collaborate cross-functionally with individuals in a wide variety of disciplines and backgrounds and build strategic relationships.
- Excellent written and oral communication skills and the ability to interface with senior leadership.
- Travel: 30% to the USA (only candidates with a valid US B1 visa who are available to travel onsite at very short notice).
Qualifications
Any IT / Computers Graduate or Post Graduate



