11+ EMC Isilon Jobs in Pune | EMC Isilon Job openings in Pune
• Responsible for performing daily operational tasks and maintaining availability at the customer site(s). Provisions solutions based on standardized procedures as outlined by Dell Technologies best-practice documentation.
• Excellent troubleshooting skills with a proven track record of successful incident management.
• Participates in the design and operational execution of the customer's disaster recovery process as required. Performs necessary storage infrastructure maintenance and data migration as required.
• Replication sessions: troubleshoots any errors encountered in replication sessions and establishes replication sessions for new sites.
• Seeks advice or assistance from management and/or Technical Support as required during difficult customer situations. Works in conjunction with EMC colleagues to ensure effective resolution of technical issues encountered during implementations.
• Mentors junior staff.
• Good understanding of operating systems (Windows and Linux).
Essential Requirements:
• Specialist skills/knowledge of VMAX, PowerMax, VNX Block, Unity, Isilon, VPLEX, and SAN switches, along with support knowledge of SRM as well as Cisco and Brocade.
• Proficiency in hardware, software, and/or operating-system environments.
• Provides on-call support.
• Expert certification in the relevant technology/product.
• Presentation and negotiation skills.
• Organizational skills.
Experience:
• 5+ years of relevant experience
Job Title: Senior Python Developer – Product Engineering
Location: Pune, India
Experience Required: 3 to 7 Years
Employment Type: Full-time
Employment Agreement: Minimum 3 years (on completion of 3 years, a one-time commitment bonus will be applicable based on performance)
🏢 About Our Client
Our client is a leading enterprise cybersecurity company offering an integrated platform for Digital Rights Management (DRM), Enterprise File Sync and Share (EFSS), and Content-Aware Data Protection (CDP). With patented technologies for secure file sharing, endpoint encryption, and real-time policy enforcement, the platform helps organizations maintain control over sensitive data, even after it leaves the enterprise perimeter.
🎯 Role Overview
We are looking for a skilled Python Developer with a strong product mindset and experience building scalable, secure, and performance-critical systems. You will join our core engineering team to enhance backend services powering DRM enforcement, file tracking, audit logging, and file sync engines.
This is a hands-on role for someone who thrives in a product-first, security-driven environment and wants to build technologies that handle terabytes of enterprise data across thousands of endpoints.
🛠️ Key Responsibilities
● Develop and enhance server-side services for DRM policy enforcement, file synchronization, data leak protection, and endpoint telemetry.
● Build Python-based backend APIs and services that interact with file systems, agent software, and enterprise infrastructure.
● Work on delta sync, file versioning, audit trails, and secure content preview/rendering services.
● Implement secure file handling, encryption workflows, and token-based access controls across modules.
● Collaborate with DevOps to optimize scalability, performance, and availability of core services across hybrid deployments (on-prem/cloud).
● Debug and maintain production-level services; drive incident resolution and performance optimization.
● Integrate with 3rd-party platforms such as LDAP, AD, DLP, CASB, and SIEM systems.
● Participate in code reviews, architecture planning, and mentoring junior developers.
📌 Required Skills & Experience
● 3+ years of professional experience with Python 3.x, preferably in enterprise or security domains.
● Strong understanding of multithreading, file I/O, inter-process communication, and low-level system APIs.
● Expertise in building RESTful APIs, schedulers, workers (Celery), and microservices.
● Solid experience with encryption libraries (OpenSSL, cryptography.io) and secure coding practices.
● Hands-on experience with PostgreSQL, Redis, SQLite, or other transactional and cache stores.
● Familiarity with Linux internals, filesystem hooks, journaling/logging systems, and OS-level operations.
● Experience with source control (Git), containerization (Docker/K8s), and CI/CD.
● Proven ability to write clean, modular, testable, and scalable code for production environments.
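The multithreading and file I/O skills listed above can be sketched with a small worker-pool pattern, standard library only. The file-hashing task is a hypothetical stand-in for the real services; nothing here reflects the client's actual codebase.

```python
# Illustrative sketch: a thread pool hashing files concurrently, the kind of
# multithreaded file-I/O pattern the skills list above refers to.
import hashlib
import os
import queue
import tempfile
import threading

def hash_file(path: str) -> str:
    """Hash a file in chunks so large files are never read whole into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def worker(jobs: "queue.Queue", results: dict, lock: threading.Lock) -> None:
    while True:
        path = jobs.get()
        if path is None:            # sentinel: shut this worker down
            jobs.task_done()
            return
        digest = hash_file(path)
        with lock:                  # dict writes guarded by a lock
            results[path] = digest
        jobs.task_done()

def hash_all(paths, n_workers: int = 4) -> dict:
    jobs, results, lock = queue.Queue(), {}, threading.Lock()
    threads = [threading.Thread(target=worker, args=(jobs, results, lock))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for p in paths:
        jobs.put(p)
    for _ in threads:
        jobs.put(None)              # one sentinel per worker
    jobs.join()
    return results

# Tiny demo with a temporary file:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
digests = hash_all([tmp.name], n_workers=2)
assert digests[tmp.name] == hashlib.sha256(b"hello").hexdigest()
os.unlink(tmp.name)
```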
➕ Preferred/Bonus Skills
● Experience in EFSS, DRM, endpoint DLP, or enterprise content security platforms.
● Knowledge of file diffing algorithms (rsync, delta encoding) or document watermarking.
● Prior experience with agent-based software (Windows/Linux), desktop sync tools, or version control systems.
● Exposure to compliance frameworks (e.g., DPDP Act, GDPR, RBI-CSF) is a plus.
🌟 What We Offer
● Work on a patented and mission-critical enterprise cybersecurity platform
● Join a fast-paced team focused on innovation, security, and customer success
● Hybrid work flexibility with competitive compensation and growth opportunities
● Direct impact on product roadmap, architecture, and IP development
Work Mode: Hybrid
Need B.Tech, BE, M.Tech, or ME candidates (mandatory)
Must-Have Skills:
● Educational qualification: B.Tech, BE, M.Tech, or ME in any field.
● Minimum of 3 years of proven experience as a Data Engineer.
● Strong proficiency in Python programming language and SQL.
● Experience with Databricks and with setting up and managing data pipelines and data warehouses/lakes.
● Good comprehension and critical thinking skills.
● Kindly note that the salary bracket varies with the candidate's experience:
- 4 to 6 years of experience: salary up to 22 LPA
- 5 to 8 years of experience: salary up to 30 LPA
- More than 8 years of experience: salary up to 40 LPA
● You will work on developing new features in the product and modifying existing features so that they fit logically with the new ones.
● Implementing designs in code and testing them with white-box (WB) cases. Fixing issues in existing components. Participating in code reviews and reviews of WB cases.
● Preparing documentation required for the product and its deployment. Participating in build cycles and release processes. Participating in testing drives when necessary.
Keyskills Required:-
● Can learn fast and apply gained knowledge to solve practical problems. A hacker by mindset and an artist when it comes to design and coding.
● Should be able to learn quickly on the job: JDBC, Apache Commons, Google APIs for Java, LDAP, Apache Logging, Lucene, Quartz Scheduler, Struts, XML, Bootstrap, jQuery, KnockoutJS, cryptography libraries, and the Spring Framework.
About Company
They are a group of companies based out of Pune that seeks to use innovative pedagogy and technology to bring the best global standards of skill development to the world's underprivileged citizens.
Responsibilities
• Formulate business strategy with others in the executive team
• Design policies that align with overall strategy
• Implement efficient processes and standards
• Coordinate customer service operations and find ways to ensure customer retention
• Plan, monitor, and analyze key metrics for day-to-day operations to ensure efficient and timely completion of tasks
• Ensure compliance and data accuracy
• Oversee the implementation of technology solutions throughout the organization
• Manage contracts and relations with customers, vendors, partners and other stakeholders
• Oversee expenses and budgeting to help the organization optimize costs and benefits
• Mentor and motivate teams to achieve productivity and engagement
• Report on operational performance and suggest improvements
Requirements and skills
• Proven experience as Head of Operations, Operations Director or similar leadership role
• Familiarity with all business functions including HR, finance, supply chain and IT
• Good IT knowledge
• Good with numbers and financial planning – budgets and business plans
• Outstanding communication and negotiation skills
• Excellent organizational and leadership ability
• Analytical mind
• Problem-solving aptitude
• Bachelor’s degree (or equivalent) or Master’s in business administration or a related field
About us:
TAPPP (https://tappp.com/) is building a next-generation digital platform, leveraging cell-based architecture to integrate technologies like Artificial Intelligence, Rules, Workflows, Microservices, FaaS (Function as a Service), Micro-frontends, and Micro apps into a highly extensible, cutting-edge technology platform. It brings together sports fans with broadcasters, sports teams, and sportsbooks to create a marketplace for all aspects of sports, and is available across platforms via the Web, Mobile, Roku, and Tablets.
Building out this brand presents significant product and engineering challenges. At the center of solving those challenges is the TAPPP Product Engineering team which is responsible for the TAPPP product end to end.
TAPPP is led by a very able leadership team drawn from industry leaders at companies like ESPN, Amazon, Blackhawk, Kargocard, Visa, and many others.
The organization is flat, processes are minimal, individual responsibility is big, and there is an emphasis on keeping non-productive influences out of the everyday technical decision-making process. Upholding these philosophies will be imperative as we execute our aggressive plan of global expansion over the next 2 years.
Who are we looking for:
A coding enthusiast who loves writing elegant code and developing software systems.
As a senior Java developer, you will be part of the core product development team that is responsible for building high-performance components of the TAPPP platform.
Your responsibility:
- You will be responsible for designing, coding, reviewing, testing, and bug-fixing different modules of the software product that needs to work seamlessly across different environments.
- Write production-quality code in Java, J2EE, and Spring
- You will work in an agile team on TAPPP's revolutionary platform. You'll be using cutting-edge solutions (Spring Boot, Docker, Kafka, Redis, Continuous Delivery) to create and maintain high-load distributed services that are part of our messaging platform.
Mandatory technical skills:
- Hands-on experience with
- Java 1.7+
- RDBMS (MySQL/PostgreSQL)
- JPA (Hibernate or any other ORM framework)
- Spring Boot, Spring MVC, Spring Security
- Hands-on experience in writing extensible RESTful APIs
- Hands-on Java development experience (all facets of development) with a sound understanding of OOAD.
- Should have excellent debugging, code review, and design review skills
- Should have a sound understanding of a Microservice based architecture
Good to have technical skills:
Kafka
GraphQL
Redis
AWS (ECS, Cloudwatch)
Other
- Strong independent contributor
- Comfortable working in a start-up environment
The position is based in Pune, India.
JOB DESCRIPTION
- 2 to 6 years of experience in imparting technical training/ mentoring
- Must have very strong concepts of Data Analytics
- Must have hands-on and training experience in Python, advanced Python, R programming, SAS, and machine learning
- Must have good knowledge of SQL and Advanced SQL
- Should have basic knowledge of Statistics
- Should be well-versed in operating systems (GNU/Linux) and network fundamentals
- Must have knowledge of MS Office (Excel/Word/PowerPoint)
- Self-Motivated and passionate about technology
- Excellent analytical and logical skills and team player
- Must have exceptional Communication Skills/ Presentation Skills
- Good aptitude skills are preferred
Responsibilities:
- Ability to quickly learn any new technology and impart the same to other employees
- Ability to resolve all technical queries of students
- Conduct training sessions and drive placement-driven quality in the training
- Must be able to work independently without the supervision of a senior person
- Participate in reviews/ meetings
Qualification:
- UG: Any Graduate in IT/Computer Science, B.Tech/B.E. – IT/ Computers
- PG: MCA/MS/MSC – Computer Science
- Any Graduate/ Post graduate, provided they are certified in similar courses
ABOUT EDUBRIDGE
EduBridge is an Equal Opportunity employer and we believe in building a meritorious culture where everyone is recognized for their skills and contribution.
Launched in 2009, EduBridge Learning is a workforce development and skilling organization with 50+ training academies across 18 states pan-India. The organization has been providing skilled manpower to corporates for over 10 years and is a leader in its space. We have trained over a lakh semi-urban and economically underprivileged youth in relevant life skills and industry-specific skills, and provided placements in over 500 companies. Our latest product, E-ON, is committed to complementing our training delivery with an online training platform, enabling students to learn anywhere and anytime.
To know more about EduBridge please visit: http://www.edubridgeindia.com/
You can also visit us on Facebook (https://www.facebook.com/Edubridgelearning/) and LinkedIn (https://www.linkedin.com/company/edubridgelearning/) for our latest initiatives and products.
- Design, create, test, and maintain data pipeline architecture in collaboration with the Data Architect.
- Build the infrastructure required for extraction, transformation, and loading of data from a wide variety of data sources using Java, SQL, and Big Data technologies.
- Support the translation of data needs into technical system requirements. Support in building complex queries required by the product teams.
- Build data pipelines that clean, transform, and aggregate data from disparate sources
- Develop, maintain and optimize ETLs to increase data accuracy, data stability, data availability, and pipeline performance.
- Engage with Product Management and Business to deploy and monitor products/services on cloud platforms.
- Stay up-to-date with advances in data persistence and big data technologies and run pilots to design the data architecture to scale with the increased data sets of consumer experience.
- Handle data integration, consolidation, and reconciliation activities for digital consumer / medical products.
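The clean, transform, and aggregate responsibilities above can be sketched in plain Python. The listing's real stack is Java, SQL, and Big Data tooling; this standard-library version, with made-up sample rows, only shows the shape of such a pipeline.

```python
# Illustrative clean -> transform -> aggregate pipeline in plain Python,
# standing in for the ETL work described above. Sample data is invented.
from collections import defaultdict

raw = [
    {"user": "a1", "amount": "10.5", "region": "EU"},
    {"user": "a2", "amount": "",     "region": "US"},   # dirty row
    {"user": "a1", "amount": "4.5",  "region": "EU"},
]

def clean(rows):
    """Drop rows with missing amounts (a stand-in for validation rules)."""
    return [r for r in rows if r["amount"]]

def transform(rows):
    """Cast amounts to float so they can be aggregated."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

def aggregate(rows):
    """Total amount per region."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

result = aggregate(transform(clean(raw)))
assert result == {"EU": 15.0}
```

Keeping each stage a pure function over rows mirrors how the same logic would later be expressed as SQL views or Spark transformations.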
Job Qualifications:
- Bachelor’s or master's degree in Computer Science, Information management, Statistics or related field
- 5+ years of experience in the Consumer or Healthcare industry in an analytical role, with a focus on building data pipelines, querying data, and analyzing and clearly presenting analyses to members of the data science team.
- Technical expertise with data models, data mining.
- Hands-on Knowledge of programming languages in Java, Python, R, and Scala.
- Strong knowledge of Big Data tools like Snowflake, AWS Redshift, Hadoop, MapReduce, etc.
- Having knowledge in tools like AWS Glue, S3, AWS EMR, Streaming data pipelines, Kafka/Kinesis is desirable.
- Hands-on knowledge of SQL and NoSQL database design.
- Having knowledge in CI/CD for the building and hosting of the solutions.
- Having AWS certification is an added advantage.
- Strong knowledge of visualization tools like Tableau and QlikView is an added advantage.
- A team player capable of working and integrating across cross-functional teams for implementing project requirements. Experience in technical requirements gathering and documentation.
- Ability to work effectively and independently in a fast-paced agile environment with tight deadlines
- A flexible, pragmatic, and collaborative team player with the innate ability to engage with data architects, analysts, and scientists


