11+ Multiprotocol Label Switching (MPLS) Jobs in Pune | Multiprotocol Label Switching (MPLS) Job openings in Pune
Company Overview:
Virtana delivers the industry’s only unified platform for Hybrid Cloud Performance, Capacity, and Cost Management. Our platform provides unparalleled, real-time visibility into the performance, utilization, and cost of infrastructure across the hybrid cloud, empowering customers to manage their mission-critical applications across physical, virtual, and cloud computing environments. Our SaaS platform allows organizations to easily manage and optimize their spend in the public cloud, ensure resources are performing properly through real-time monitoring, and plan migrations across the hybrid cloud.
As we continue to expand our portfolio, we are seeking a highly skilled, hands-on Staff Software Engineer in backend technologies to contribute to the development of our next-generation monitoring products.
Position Overview:
As a Staff Software Engineer specializing in backend technologies for Storage and Network monitoring in an AI-enabled data center as well as the cloud, you will play a critical role in designing, developing, and delivering high-quality features within aggressive timelines. Your expertise in microservices-based streaming architectures and strong hands-on development skills are essential for solving complex problems related to large-scale data processing. Proficiency in backend technologies such as Java and Python is crucial.
Work Location: Pune
Job Type: Hybrid
Key Responsibilities:
- Hands-on Development: Actively participate in the design, development, and delivery of high-quality features, demonstrating strong hands-on expertise in backend technologies like Java, Python, Go or related languages.
- Microservices and Streaming Architectures: Design and implement microservices-based streaming architectures to efficiently process and analyze large volumes of data, ensuring real-time insights and optimal performance.
- Agile Development: Collaborate within an agile development environment to deliver features on aggressive schedules, maintaining a high standard of quality in code, design, and architecture.
- Feature Ownership: Take ownership of features from inception to deployment, ensuring they meet product requirements and align with the overall product vision.
- Problem Solving and Optimization: Tackle complex technical challenges related to data processing, storage, and real-time monitoring, and optimize backend systems for high throughput and low latency.
- Code Reviews and Best Practices: Conduct code reviews, provide constructive feedback, and promote best practices to maintain a high-quality and maintainable codebase.
- Collaboration and Communication: Work closely with cross-functional teams, including UI/UX designers, product managers, and QA engineers, to ensure smooth integration and alignment with product goals.
- Documentation: Create and maintain technical documentation, including system architecture, design decisions, and API documentation, to facilitate knowledge sharing and onboarding.
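The streaming responsibilities above boil down to aggregating high-volume event data in near real time. As a rough, self-contained illustration (not Virtana's actual code; the metric name and the 60-second window are invented for the example), a tumbling-window averager over metric events might look like this:

```python
from collections import defaultdict

def window_averages(events, window=60):
    """Group (metric, epoch_ts, value) events into tumbling windows of
    `window` seconds and return the mean value per (metric, window_start)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for metric, ts, value in events:
        window_start = ts - (ts % window)  # align to the window boundary
        sums[(metric, window_start)] += value
        counts[(metric, window_start)] += 1
    return {key: sums[key] / counts[key] for key in sums}

sample = [
    ("disk.latency_ms", 100, 4.0),   # falls in the window starting at 60
    ("disk.latency_ms", 110, 6.0),   # same window
    ("disk.latency_ms", 185, 10.0),  # window starting at 180
]
print(window_averages(sample))
```

In a real microservices-based streaming system this per-window state would live in a stream processor rather than a local dict, but the aggregation logic is the same shape.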
Qualifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 8+ years of hands-on experience in backend development, demonstrating expertise in Java, Python or related technologies.
- Strong domain knowledge in Storage and Networking, with exposure to monitoring technologies and practices.
- Experience in handling large data lakes with purpose-built data stores (vector databases, NoSQL, graph, time-series).
- Practical knowledge of OO design patterns and frameworks such as Spring and Hibernate.
- Extensive experience with cloud platforms such as AWS, Azure, or GCP, and development expertise with Kubernetes, Docker, etc.
- Solid experience designing and delivering features with high quality on aggressive schedules.
- Proven experience in microservices-based streaming architectures, particularly in handling large amounts of data for storage and networking monitoring.
- Familiarity with performance optimization techniques and principles for backend systems.
- Excellent problem-solving and critical-thinking abilities.
- Outstanding communication and collaboration skills.
Why Join Us:
- Opportunity to be a key contributor in the development of a leading performance monitoring company specializing in AI-powered Storage and Network monitoring.
- Collaborative and innovative work environment.
- Competitive salary and benefits package.
- Professional growth and development opportunities.
- Chance to work on cutting-edge technology and products that make a real impact.
If you are a hands-on technologist with a proven track record of designing and delivering high-quality features on aggressive schedules and possess strong expertise in microservices-based streaming architectures, we invite you to apply and help us redefine the future of performance monitoring.
Job Details
- Job Title: Lead I - Data Engineering
- Industry: Global digital transformation solutions provider
- Domain: Information Technology (IT)
- Experience Required: 6-9 years
- Employment Type: Full Time
- Job Location: Pune
- CTC Range: Best in Industry
Job Description
Job Title: Senior Data Engineer (Kafka & AWS)
Responsibilities:
- Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
- Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
- Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
- Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
- Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
- Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
- Uphold data security, governance, and compliance standards across all data operations.
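To make the ETL/ELT bullets concrete: the heart of such a pipeline is usually a pure transform function that sits between the Kafka consumer and the S3/Glue sink. A minimal sketch in Python (the `order_id`/`amount` schema is hypothetical, invented for illustration, not part of this role's actual data model):

```python
import json
from typing import Optional

def transform_record(raw: bytes) -> Optional[dict]:
    """Validate and reshape one Kafka message value before it is handed
    to the S3/Glue layer. Returns None for malformed records so the
    caller can route them to a dead-letter topic."""
    try:
        event = json.loads(raw)
    except (json.JSONDecodeError, UnicodeDecodeError):
        return None
    # Required fields in this illustrative schema
    if not all(k in event for k in ("order_id", "amount", "currency")):
        return None
    return {
        "order_id": str(event["order_id"]),
        "amount_cents": int(round(float(event["amount"]) * 100)),
        "currency": str(event["currency"]).upper(),
    }
```

In production this function would be called from a consumer poll loop (e.g. via `confluent-kafka`) or a Glue streaming job, with rejected records published to a dead-letter topic for inspection.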
Requirements:
- Minimum of 5 years of experience in Data Engineering or related roles.
- Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
- Proficient in coding with Python, SQL, and Java, with Java strongly preferred.
- Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
- Excellent problem-solving, communication, and collaboration skills.
- Flexibility to write production-quality code in both Python and Java as required.
Skills: AWS, Kafka, Python
Notice Period: 0 to 15 days only
Google Data Engineer - SSE
Position Description
Google Cloud Data Engineer
Notice Period: Immediate to 30 days serving
Job Description:
We are seeking a highly skilled Data Engineer with extensive experience in Google Cloud Platform (GCP) data services and big data technologies. The ideal candidate will be responsible for designing, implementing, and optimizing scalable data solutions while ensuring high performance, reliability, and security.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and architectures using GCP data services.
• Implement and optimize solutions using BigQuery, Dataproc, Composer, Pub/Sub, Dataflow, GCS, and BigTable.
• Work with GCP databases such as Bigtable, Spanner, Cloud SQL, and AlloyDB, ensuring performance, security, and availability.
• Develop and manage data processing workflows using Apache Spark, Hadoop, Hive, Kafka, and other Big Data technologies.
• Ensure data governance and security using Dataplex, Data Catalog, and other GCP governance tooling.
• Collaborate with DevOps teams to build CI/CD pipelines for data workloads using Cloud Build, Artifact Registry, and Terraform.
• Optimize query performance and data storage across structured and unstructured datasets.
• Design and implement streaming data solutions using Pub/Sub, Kafka, or equivalent technologies.
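One recurring design point behind the streaming bullet above: Pub/Sub (like Kafka in most configurations) guarantees at-least-once delivery, so consumers must be idempotent or deduplicate explicitly. A toy sketch of that dedup step, with state kept in memory purely for illustration:

```python
def dedupe(events, seen=None):
    """Pub/Sub-style brokers deliver at least once, so the same event can
    arrive more than once; drop repeats by event id. In a real Dataflow
    job 'seen' would be a TTL'd external store (e.g. Bigtable); here it
    is a plain in-memory set for illustration."""
    if seen is None:
        seen = set()
    unique = []
    for event in events:
        if event["id"] not in seen:
            seen.add(event["id"])
            unique.append(event)
    return unique
```

Passing the same `seen` set across batches lets the dedup survive multiple poll cycles, which is the role the external keyed store plays in a production pipeline.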
Required Skills & Qualifications:
• 8-15 years of experience
• Strong expertise in GCP Dataflow, Pub/Sub, Cloud Composer, Cloud Workflow, BigQuery, Cloud Run, Cloud Build.
• Proficiency in Python and Java, with hands-on experience in data processing and ETL pipelines.
• In-depth knowledge of relational databases (SQL, MySQL, PostgreSQL, Oracle) and NoSQL databases (MongoDB, Scylla, Cassandra, DynamoDB).
• Experience with Big Data platforms such as Cloudera, Hortonworks, MapR, Azure HDInsight, IBM Open Platform.
• Strong understanding of AWS Data services such as Redshift, RDS, Athena, SQS/Kinesis.
• Familiarity with data formats such as Avro, ORC, Parquet.
• Experience handling large-scale data migrations and implementing data lake architectures.
• Expertise in data modeling, data warehousing, and distributed data processing frameworks.
• GCP Professional Data Engineer certification or equivalent.
Good to Have:
• Experience in BigQuery, Presto, or equivalent.
• Exposure to Hadoop, Spark, Oozie, HBase.
• Understanding of cloud database migration strategies.
• Knowledge of GCP data governance and security best practices.
Role & responsibilities
- Develop new leads among industries that can install solar
- Develop strong relationships with the different stakeholders in these industries
- Generate proposals and offers
- Conduct negotiations with prospective customers
- Close orders with prospective customers
- Work with the Products team to better mold the product to the customers' requirements
- Hand-hold customers through their transition to renewable energy
Preferred candidate profile
- Past work experience as a sales executive/manager in an EPC company or project developer
- First-hand knowledge of rooftop solar
- Strong industry connections, enabling independent work
- Bachelor's degree in an associated field
- Comfortable with travelling overnight to visit sites if required
Perks and benefits
- Pay above industry standards
- Very attractive performance-linked bonus
- Attractive ESOPs
- 100% ownership of the task
- Flexible, employee-centric work culture
- Massive growth potential
We are hiring for Salesforce Community Cloud
8+ years of experience
3+ years of experience in Salesforce Community development
Expert in Lightning component development
Should be able to set up and manage Communities
Should be able to customize Communities
Should have experience with Salesforce integrations
Should have some exposure to the MuleSoft Anypoint Platform
Collaborating with the client to understand their business objectives and shepherding them through the intricacies of the Salesforce platform. Analyzing and reviewing business, functional, and technical requirements.
Designing and creating application architecture, and developing solutions that achieve the customer’s objectives and are secure, scalable, and maintainable; translating business requirements into systems, services, and solutions. Implementing automation to improve processes.
Participating in project team meetings and communicating effectively with business and technical team members and stakeholders, including across team and organizational boundaries.
Reporting project status as required.
#hiring #community #salesforce #salesforcecommercecloud #salesforceconsultant #salesforcecommunity #sfdc
We have an urgent requirement for Big Data Developer profiles in our reputed MNC company.
Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9 years
Skills: PySpark and AWS; or Spark, Scala, and AWS; or Python and AWS
We are looking for Full Stack Developers. Candidates should be hands-on with front-end and back-end Java technologies (2-10 years of experience). The designation will be Java Developer, Sr. Java Developer, or Java Lead based on years of experience.
We are looking for the below skills:
Must-Haves
- JSP, JS, Servlets, CSS, HTML, jQuery
- MVC, Struts, application servers (JBoss/WebSphere), REST services
- OOP concepts & Core Java
- Basic SQL
- Good Analytical Skills
Good to Have
- React, Angular
- Explain Plan/ Query Tuning
- Jenkins, Maven
About Mudrantar Solutions Private Limited
Mudrantar Solutions Pvt. Ltd. is a wholly owned subsidiary of the US-based startup Mudrantar Corporation. Mudrantar is a well-funded startup focused on disruptive changes in accounting software for small, medium, and large businesses in India. Our state-of-the-art OCR + machine learning technology allows customers to simply take photos, and our software does the rest of the heavy lifting. Our strategy for CAs, CSs, and tax practitioners is realized through web access for customers to manage their practice, with client communication handled through a freely available mobile app. We also offer data entry automation services through our AI/ML platform.
HR Associate
As an HR Associate in this IT startup, you will use your unique blend of HR and communication skills to recruit top talent in the IT industry as well as retain employees. In this role, you will be responsible for obtaining and recording HR information, managing the HR database, and assisting company employees with enrollment procedures and HR-related issues.
Position
- Full time employment
Location
- Hyderabad or Pune (preferred)
- Any location in India
Requirements
- End-to-end recruitment
- Onboarding and induction
- Learning & development
- Employee engagement
- Personnel record management in ERP
- Attendance & payroll assistance
- Employee retention, exit interviews & formalities
Salary:
₹350,000.00 - ₹500,000.00 per year
Benefits:
- Health insurance
- Paid sick time
- Paid time off
- Work from home
Education:
- Master's (Preferred)
Experience:
- Human Resources Generalist: 2+ years (Preferred)
- recruitment: 2+ years (Preferred)
- HRIS: 2+ years (Preferred)
JD for NodeJS
Mandatory Skills - Node.js, JavaScript, Express.js, MongoDB, Data Structures, Algorithms.
Please find the JD below:-
- Expertise in Node.js web frameworks like Meteor, Express, and Kraken.js
- Expertise in building highly scalable web services using Node.js; creating REST APIs with the help of Node middleware
- Deep understanding of REST and API design
- Experience designing APIs for consistency, simplicity, and extensibility
- Expertise with JavaScript testing frameworks like Jasmine, QUnit, Mocha, Sinon, and Chai
- Expertise with build tools like Webpack, Gulp, and Grunt
- Integration of various application components
- Experience in various phases of the Software Development Life Cycle (SDLC), such as requirements analysis, design, and implementation in an agile environment
About Us :-
Mobile Programming LLC is a US-based digital transformation company. We help enterprises transform ideas into innovative and intelligent solutions spanning the Internet of Things, digital commerce, business intelligence analytics, and cloud programming. Bring your challenges to us; we will give you the smartest solutions. From conceptualizing and engineering to advanced manufacturing, we help customers build and scale products fit for the global marketplace.
Mobile Programming LLC has offices in Los Angeles, San Jose, Glendale, San Diego, Phoenix, Plano, New York, Fort Lauderdale, and Boston. Mobile Programming is an SAP Preferred Vendor, Apple Adjunct Partner, Google Empaneled Mobile Vendor, and Microsoft Gold Certified Partner.
We also have a presence in India across 7 locations: Gurgaon, Mohali, Panchkula, Pune, Bangalore, Dehradun, and Chennai.
For more information, please visit our website:
https://www.mobileprogramming.com