11+ Real time media streaming Jobs in Pune | Real time media streaming Job openings in Pune
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
- 4-10 years of experience in software development.
- At least 2 years of relevant work experience on large scale Data applications.
- Strong coding experience in Java is mandatory
- Good aptitude, strong problem solving abilities, and analytical skills, ability to take ownership as appropriate
- Should be able to do coding, debugging, performance tuning and deploying the apps to Prod.
- Good working experience with:
  - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
  - Kafka
  - J2EE frameworks (Spring/Hibernate/REST)
  - Spark Streaming or any other streaming technology
- Ability to work on the sprint stories to completion along with Unit test case coverage.
- Experience working in Agile Methodology
- Excellent communication and coordination skills
- Knowledge of (and preferably hands-on experience with) UNIX environments and continuous integration tools.
- Must be able to integrate quickly into the team and work independently towards team goals
- Take the complete responsibility of the sprint stories' execution
- Be accountable for the delivery of the tasks in the defined timelines with good quality.
- Follow the processes for project execution and delivery.
- Follow agile methodology
- Work with the team lead closely and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team.
- Take part in brainstorming sessions and suggest improvements to the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Work with the clients and US-based counterparts on the project.
- Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
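As a toy illustration of the streaming-aggregation work this role describes, the sketch below mimics the shape of a windowed count over micro-batches in plain Python. It does not use the actual Spark Streaming or Kafka APIs; batch contents and window size are made-up examples.

```python
from collections import Counter, deque

def windowed_counts(batches, window=3):
    """Toy sliding-window word count over micro-batches,
    mimicking the shape of a Spark Streaming windowed aggregation."""
    window_batches = deque(maxlen=window)  # keeps only the last `window` batches
    results = []
    for batch in batches:
        window_batches.append(Counter(batch.split()))
        total = Counter()
        for c in window_batches:
            total += c  # merge counts across the current window
        results.append(dict(total))
    return results

# Each string stands in for one micro-batch of log events (hypothetical data)
batches = ["error warn error", "info error", "warn"]
print(windowed_counts(batches, window=2))
```

Real Spark code would express the same idea with `reduceByKeyAndWindow` over a DStream or a windowed aggregation in Structured Streaming; the sliding-window semantics are the same.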
The Role
Working in our Integrations Engineering team, you’ll play a key role in building products for healthcare. You’ll design and implement integrations with EMRs/EHRs and healthcare APIs, ensuring our technology fits seamlessly into clinical workflows.
What You’ll Do
- Build and maintain integrations between our products and healthcare systems, following interoperability standards like HL7, FHIR, SMART on FHIR.
- Extend Azure Health Data Services (AHDS) to enable conformance with Australian FHIR standards.
- Work with APIs and distributed systems that power the product's core features, ensuring performance, security, and scalability.
- Improve the reliability and accuracy of data pipelines.
- Champion best practices for building robust, maintainable systems in healthcare contexts.
What We’re Looking For
- 5+ years of professional software engineering experience.
- Prior experience integrating with healthcare systems (Cliniko, Halaxy, MediRecords, Epic, Cerner, athenahealth, Meditech, etc.).
- Familiarity with healthcare interoperability standards (FHIR, HL7, SMART on FHIR, etc.).
- Proficiency in a mainstream programming language such as Python, C++, Rust, C#, Java, JavaScript, or TypeScript, with experience building APIs and backend services (preferably with FastAPI).
- Experience working with datastores such as MongoDB and Redis.
- Experience with integration design, including RESTful APIs, authentication/authorization (OAuth2, OpenID Connect), and event-driven systems.
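As a rough sketch of what FHIR-style integration code deals with, the snippet below builds a minimal Patient resource as a plain dict. The field names (`resourceType`, `name`, `birthDate`) follow the public FHIR R4 Patient resource, but the validation rules are simplified assumptions, not real FHIR conformance checking.

```python
import json

def build_patient(family, given, birth_date):
    """Construct a minimal FHIR R4 Patient resource as a dict.
    Only a handful of fields are shown; real resources carry many more."""
    return {
        "resourceType": "Patient",
        "name": [{"family": family, "given": [given]}],
        "birthDate": birth_date,  # FHIR dates are YYYY-MM-DD strings
    }

def basic_checks(resource):
    """Simplified sanity checks; real validation uses the FHIR profiles."""
    return (
        resource.get("resourceType") == "Patient"
        and bool(resource.get("name"))
        and len(resource.get("birthDate", "")) == 10
    )

patient = build_patient("Doe", "Jane", "1985-04-12")
print(json.dumps(patient))
print(basic_checks(patient))
```

In practice a library such as `fhir.resources` (or a server like AHDS) handles serialization and validation; the point here is only the resource shape an integration engineer works with daily.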
Bonus points
- Previous experience in digital health startups or companies building EHR/EMR solutions.
- Knowledge of medical terminology or curiosity about speech-to-text systems.
- Full-stack experience with backend services (Python, FastAPI) and frontend frameworks (React.js) is a plus.
- Passion for improving healthcare and clinician experience.
MUST-HAVES:
- LLM Integration & Prompt Engineering
- Context & Knowledge Base Design
- Experience running LLM evals
NOTICE PERIOD: Immediate – 30 Days
SKILLS: LLM, AI, PROMPT ENGINEERING
NICE TO HAVES:
- Data Literacy & Modelling Awareness
- Familiarity with Databricks, AWS, and ChatGPT Environments
ROLE PROFICIENCY:
Role Scope / Deliverables:
- Serve as the link between business intelligence, data engineering, and AI application teams, ensuring the Large Language Model (LLM) interacts effectively with the modeled dataset.
- Define and curate the context and knowledge base that enables GPT to provide accurate, relevant, and compliant business insights.
- Collaborate with Data Analysts and System SMEs to identify, structure, and tag data elements that feed the LLM environment.
- Design, test, and refine prompt strategies and context frameworks that align GPT outputs with business objectives.
- Conduct evaluation and performance testing (evals) to validate LLM responses for accuracy, completeness, and relevance.
- Partner with IT and governance stakeholders to ensure secure, ethical, and controlled AI behavior within enterprise boundaries.
KEY DELIVERABLES:
- LLM Interaction Design Framework: Documentation of how GPT connects to the modeled dataset, including context injection, prompt templates, and retrieval logic.
- Knowledge Base Configuration: Curated and structured domain knowledge to enable precise and useful GPT responses (e.g., commercial definitions, data context, business rules).
- Evaluation Scripts & Test Results: Defined eval sets, scoring criteria, and output analysis to measure GPT accuracy and quality over time.
- Prompt Library & Usage Guidelines: Standardized prompts and design patterns to ensure consistent business interactions and outcomes.
- AI Performance Dashboard / Reporting: Visualizations or reports summarizing GPT response quality, usage trends, and continuous improvement metrics.
- Governance & Compliance Documentation: Inputs to data security, bias prevention, and responsible AI practices in collaboration with IT and compliance teams.
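The "Evaluation Scripts & Test Results" deliverable above could start from something as simple as scoring model answers against a gold set. This is a hypothetical sketch using normalized exact match; real eval suites typically use richer scoring criteria (rubrics, semantic similarity, LLM-as-judge).

```python
def run_eval(gold, predictions):
    """Score predicted answers against a gold set by normalized exact match.
    Returns per-case results and overall accuracy."""
    results = []
    for case_id, expected in gold.items():
        got = predictions.get(case_id, "")
        ok = got.strip().lower() == expected.strip().lower()
        results.append({"id": case_id, "pass": ok})
    accuracy = sum(r["pass"] for r in results) / len(results)
    return results, accuracy

# Made-up eval cases: question id -> expected answer / model output
gold = {"q1": "Paris", "q2": "42"}
preds = {"q1": "paris", "q2": "41"}
results, acc = run_eval(gold, preds)
print(acc)  # 0.5
```

Tracking this accuracy over time, per prompt template and knowledge-base version, is what feeds the AI performance dashboard described above.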
KEY SKILLS:
Technical & Analytical Skills:
- LLM Integration & Prompt Engineering – Understanding of how GPT models interact with structured and unstructured data to generate business-relevant insights.
- Context & Knowledge Base Design – Skilled in curating, structuring, and managing contextual data to optimize GPT accuracy and reliability.
- Evaluation & Testing Methods – Experience running LLM evals, defining scoring criteria, and assessing model quality across use cases.
- Data Literacy & Modeling Awareness – Familiar with relational and analytical data models to ensure alignment between data structures and AI responses.
- Familiarity with Databricks, AWS, and ChatGPT Environments – Capable of working in cloud-based analytics and AI environments for development, testing, and deployment.
- Scripting & Query Skills (e.g., SQL, Python) – Ability to extract, transform, and validate data for model training and evaluation workflows.
Business & Collaboration Skills:
- Cross-Functional Collaboration – Works effectively with business, data, and IT teams to align GPT capabilities with business objectives.
- Analytical Thinking & Problem Solving – Evaluates LLM outputs critically, identifies improvement opportunities, and translates findings into actionable refinements.
- Commercial Context Awareness – Understands how sales and marketing intelligence data should be represented and leveraged by GPT.
- Governance & Responsible AI Mindset – Applies enterprise AI standards for data security, privacy, and ethical use.
- Communication & Documentation – Clearly articulates AI logic, context structures, and testing results for both technical and non-technical audiences.
Skills Required:
- 4 to 6 years of experience in software development.
- Very strong experience in Core Java.
- Excellent Java programming skills.
- Experience in Data Structures, Algorithms and Design Patterns.
- Strong problem-solving, analytical skills and logical thinking.
- Skills to be trained: Java, Spring, MongoDB, Cassandra.
- Strong experience in Spring Boot and RESTful APIs.
- Looking for shorter notice period candidates only.
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have experience working with Python, shell scripting
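A minimal example of the ETL/SQL shape this role involves, using an in-memory SQLite database for illustration. The table and column names (`stg_orders`, `dw_sales_by_country`) are made up; a real enterprise data warehouse would use Teradata, Netezza, or a cloud warehouse instead.

```python
import sqlite3

# Illustrative staging -> warehouse transform; all names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL, country TEXT);
    INSERT INTO stg_orders VALUES (1, 10.0, 'IN'), (2, 25.0, 'IN'), (3, 5.0, 'US');
    CREATE TABLE dw_sales_by_country (country TEXT, total REAL);
""")
# Transform step: aggregate staging rows into the warehouse table
conn.execute("""
    INSERT INTO dw_sales_by_country
    SELECT country, SUM(amount) FROM stg_orders GROUP BY country
""")
rows = conn.execute(
    "SELECT country, total FROM dw_sales_by_country ORDER BY country"
).fetchall()
print(rows)  # [('IN', 35.0), ('US', 5.0)]
```

The same staging-to-warehouse pattern scales up in production ETL: land raw data, transform with set-based SQL, and load conformed aggregates.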
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data/workloads/ETL/analytics to the cloud, leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Job Location: Bangalore, Pune, Chennai, Mohali, Gurugram, Panchkula, or Dehradun
The candidate must be available to join within two weeks.
Job Description:
- 5 to 8 years of experience in Developing APIs and RESTful services using Node JS.
- Experience with AWS API Gateway
- Produce high-quality code; experience with security implementations, identifying application security risks, and implementing security patch procedures.
- Implement and Improve application logging services
- Work with the product and design teams to understand end-user requirements, formulate definitions of done, and translate that into an effective technical solution.
- Work with the QA Team to develop testing protocols to identify and correct challenges.
- Must have good analytical, debugging, and problem-solving skills.
- Good communication skills.
Good to have: Oracle, WAS/Tomcat server knowledge, and basic knowledge of shell scripting.
Minimum qualifications:
- Bachelor’s degree
Responsibilities:
- Assess go-to-market readiness, identify gaps in preparedness, and build plans to achieve annual goals.
- Work with teams to identify, qualify, and prioritize coverage for business opportunities. Participate in opportunity review meetings to provide insight into how best to secure the technical win.
- Maintain customer satisfaction.
- Resolve product problems affecting customer satisfaction.
- Build trust with customers and influential relationships.
Location: Pune
- 5+ years’ experience in developing applications for iOS using Swift 5.
- Hands-on experience in developing and integrating mobile applications for iPhone & iPad (both orientations)
- Knowledge of JSON/REST API, web services and related technologies.
- Experience with design Guidelines, UI and UX design.
- Prior experience in submitting and maintaining iOS Apps on Apple Store & TestFlight would be a plus
- Hands-on experience working with 3rd party SDKs, Libraries and APIs
- Experience with offline storage, threading, and performance tuning.
- Familiarity with cloud message APIs and push notifications.
- Familiar with Apple Human Interface Guidelines
- Dynamic design handling
- Gamification
- Strong problem-solving skills
- Should also be proficient in using Adaptive Layouts
Responsibilities and Duties
- Understand the project, strategise development plans, and deliver on time with utmost quality.
- Maintain and support multiple projects and deadlines.
- Volunteer to create new apps and improve existing ones.
- Stay in touch with market trends and new technologies.
Job Type: Full-time
Turtlemint is a technology platform (www.turtlemint.com) that facilitates the entire process of researching, decision-making and buying insurance. Turtlemint is building a unique insure-tech platform which enables the transaction of complex products with a simple and intuitive interface. On the supply side, Turtlemint aggregates more than 25 insurers and has enabled three unique demand channels. First, direct-to-consumer channels where users buy insurance products. Second, a network of more than 35,000 agents across India who use the Turtlemint platform to sell insurance. Third, a SaaS version of the same platform for large financial institutes, consumer applications, etc. It is now expanding into other financial products like mutual funds.
Turtlemint was founded by IIT/IIM graduates, ex-employees of top companies like ICICI Lombard, Yahoo, eBay, and Quikr. Our management team has experience building and growing many successful technology and consumer companies. Turtlemint is growing rapidly and is already amongst the top three insurance platforms in India. We are a well-funded startup, backed by leading venture capitalists, and began our business operations in September 2015. We strongly believe that as a technology-led company we can truly disrupt the 'old economy' financial services businesses, a multi-billion dollar industry in India.
Turtlemint promises to offer an intellectually enriching and fun work environment, and an opportunity to work with smart and dedicated colleagues. Come make an impact on millions of users on an important but often ignored aspect of their life - financial security & freedom!
Job Description
Position: Engineering Manager
Role: Engineering Manager
Experience: 8-15 years
Location: Mumbai/Pune
Responsibilities
- Lead a team of engineers and product managers in the ideation and technical development of product
- Provide strategic and operational oversight for Enterprise software product development
- Work closely with business leaders to develop short and long-term strategies
- Manage business expectations, resolve conflicts, and keep businesses aligned
- Define the processes needed to achieve operational excellence in all areas, including project management and system reliability
- Experience scaling and managing 5-20 person teams
- Develop and drive execution on 6-month and 1-year roadmaps
- Drive innovation, establish new approaches in improving productivity
- Establish a metrics-based organization, develop key operational metrics and push for continuous improvement.
- Ensure system security, data integrity, and accuracy of financial records
Skillset and Experience
- 8+ years of experience in building Enterprise software
- 8+ years of experience with programming languages such as Java, PHP, Python, or C++
- 5+ years experience with agile systems development methodologies
- 4+ years of experience managing engineering teams including hiring/termination and performance management
- 3+ years of experience with frameworks such as Spring Boot, Play Framework, or Django
Bachelors in Computer Science or a related technical field, or equivalent experience
Finally, and most importantly, the drive, energy, and motivation to succeed at delivering great customer experiences
Bonus Qualification
- Experience developing financial products like insurance, mutual funds, etc
- Ability to understand the business logic
- Organizational and analytical skills
What you get:
- To work in an early-stage consumer internet start-up in disruptive space
- To directly work with the founding team of graduates from IIT/ IIM and experience at top internet brands like Yahoo/ eBay
- Great culture and work with like-minded colleagues
- Health Insurance for yourself and your family