11+ Business operations Jobs in Pune | Business operations Job openings in Pune
Roles & Responsibilities -
The business development executive will be primarily responsible for generating new business for the company alongside the Head of Global Business Development. Daily responsibilities will include:
- Lead generation via an existing network, B2B portals and other online/offline mediums
- Market research & analysis to identify the obtainable market across the geographies we operate in
- Work collaboratively with the existing team and deliver a solid pre-sales function, from RFP response and business analysis to proposal submission
- Identify sales tactics and close deals quickly to gain maximum traction and growth
- Communicate with all stakeholders and build a cohesive environment for teammates and clients
Pre-requisites -
1-2 years of experience working in an IT services-based organization
Sound knowledge of the systems and modern technologies used to deliver applications today
Graduate or Post Graduate from any Tier 1 institution across Pune/Mumbai or other locations
Excellent written, verbal and negotiation skills
Strong email and business-English proficiency for communicating in the international market
Experience servicing international clients across the USA, UK, EU, Asia-Pacific and the Middle East
Flexibility to work across various time zones
This is a full-time, in-office position, though hybrid working with WFH twice a month may be allowed.
What do we offer?
Growth-led, inclusive environment. Ideas to impact. Good compensation and great incentives.
The Role
Working in our Integrations Engineering team, you’ll play a key role in building products for healthcare. You’ll design and implement integrations with EMRs/EHRs and healthcare APIs, ensuring our technology fits seamlessly into clinical workflows.
What You’ll Do
- Build and maintain integrations between our products and healthcare systems, following interoperability standards like HL7, FHIR, and SMART on FHIR.
- Extend Azure Health Data Services (AHDS) to enable conformance with Australian FHIR standards.
- Work with the APIs and distributed systems that power the product's core features, ensuring performance, security, and scalability.
- Improve the reliability and accuracy of data pipelines.
- Champion best practices for building robust, maintainable systems in healthcare contexts.
What We’re Looking For
- 5+ years of professional software engineering experience.
- Prior experience integrating with healthcare systems (Cliniko, Halaxy, MediRecords, Epic, Cerner, athenahealth, Meditech, etc.).
- Familiarity with healthcare interoperability standards (FHIR, HL7, SMART on FHIR, etc.).
- Proficiency in a mainstream programming language such as Python, C++, Rust, C#, Java, JavaScript, or TypeScript, with experience building APIs and backend services (preferably with FastAPI).
- Experience working with datastores such as MongoDB and Redis.
- Experience with integration design, including RESTful APIs, authentication/authorization (OAuth2, OpenID Connect), and event-driven systems.
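The FHIR resources at the heart of this role are JSON documents with a well-defined shape. As a rough sketch (the resource values and helper below are invented for illustration, not taken from the posting), this is what reading a patient's name out of a FHIR R4 `Patient` resource can look like:

```python
# Illustrative FHIR R4 Patient resource. A real integration would fetch this
# from an EMR's FHIR endpoint (GET [base]/Patient/{id}) after an
# OAuth2 / SMART on FHIR authorization flow; the values here are made up.
patient = {
    "resourceType": "Patient",
    "id": "example-1",
    "name": [{"use": "official", "family": "Nguyen", "given": ["Mai"]}],
    "birthDate": "1985-04-12",
}

def display_name(resource: dict) -> str:
    """Return 'Given Family' from the official HumanName, falling back to the first."""
    names = resource.get("name", [])
    if not names:
        return ""
    official = next((n for n in names if n.get("use") == "official"), names[0])
    return " ".join(official.get("given", []) + [official.get("family", "")]).strip()

print(display_name(patient))  # Mai Nguyen
```

In practice the resource would arrive over an authenticated RESTful call, and the `name` array can hold multiple `HumanName` entries (official, nickname, maiden), which is why the helper filters on `use`.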
Bonus points
- Previous experience in digital health startups or companies building EHR/EMR solutions.
- Knowledge of medical terminology or curiosity about speech-to-text systems.
- Full-stack experience with backend services (Python, FastAPI) and frontend frameworks (React.js) is a plus.
- Passion for improving healthcare and clinician experience.
Strong Behavioral Research / Behavioral Insights / Consumer Behavior Research Profiles
Mandatory (Experience 1): Must have 2+ years of experience in Behavioral Research, Consumer Behavior Analysis, Behavioral Science, or Market Research, focusing on understanding user motivations, decision patterns, and digital behavior
Mandatory (Experience 2): Must have hands-on experience conducting primary and secondary research, including qualitative and quantitative research methods such as interviews, ethnographic studies, surveys, behavioral experiments, and observational research
Mandatory (Experience 3): Experience analyzing user behavior patterns, engagement trends, and digital interaction journeys to generate insights that influence product design, feature development, or platform experience improvements
Mandatory (Experience 4): Must have exposure to data analysis or data mining of user behavior datasets, combining research insights with behavioral data to identify patterns, anomalies, or opportunities for product innovation
Mandatory (Skills 1): Strong understanding of Behavioral Science principles such as Cognitive Bias, Cognitive Dissonance, System 1 & System 2 Thinking, Decision Science, or Consumer Psychology
Mandatory (Skills 2): Experience in testing concepts, prototypes, or product ideas through usability testing, concept validation, or behavioral experimentation to evaluate user response and product effectiveness
Mandatory (Skills 3): Ability to synthesize research findings into actionable insights, hypotheses, and recommendations for Product, Design, or Business teams
Mandatory (Skills 4): Basic understanding of UI/UX and digital product journeys, including how users interact with mobile apps or web platforms, though this is not a design role
Mandatory (Education): Background in Behavioral Science, Psychology, Sociology, Anthropology, Ethnography, Cognitive Science, or related fields (BA/MA/MSc/BSc)
Candidates from institutes with strong behavioral research exposure are preferred
Mandatory (Company): Candidates from B2C consumer product organizations where user behavior is closely tracked and analyzed, such as consumer tech, fintech, e-commerce, or digital platforms
Mandatory (Note): This is not a pure UX Researcher or Data Analyst role. The client is looking for Design Thinking–oriented Behavioral Researchers who combine behavioral science, research methodologies, and behavioral data analysis to generate insights
Key Responsibilities:
- Contribute to the design and development of scalable backend applications using Java (Spring Boot).
- Mentor and guide a team of developers to ensure high-quality deliverables.
- Take ownership of solution architecture, coding standards, and design patterns.
- Develop and manage RESTful APIs and integrate third-party services.
- Collaborate with front-end teams, QA, and stakeholders to align technical implementation with business goals.
- Oversee deployments in hybrid cloud environments in coordination with DevOps teams.
- Conduct code reviews, lead design discussions, and manage Agile development processes (Scrum/Kanban).
- Monitor application performance and drive improvements proactively.
- Troubleshoot and resolve complex software issues across systems and services.
Required Skills:
- 6+ years of professional experience in Java development.
- Strong hands-on expertise in Spring Boot and microservices architecture.
- Working knowledge of Node.js and JavaScript/TypeScript.
- Experience with REST APIs and SQL/NoSQL databases (MySQL, PostgreSQL, MongoDB).
- Familiarity with CI/CD pipelines, Git, and modern DevOps practices.
- Proven ability to lead distributed teams and manage deliverables in a remote/hybrid work setup.
- Strong communication, leadership, and problem-solving skills.
CricStox is a Pune-based startup building a trading solution at the intersection of gametech and fintech.
We intend to build a sport-agnostic platform that allows trading in stocks of sportspersons in any sport
through our mobile & web-based applications.
We’re currently hiring a Frontend Engineer who will gather and refine specifications and requirements based
on technical needs and implement them using software development best practices.
Responsibilities?
● Mainly, but not limited to, maintaining, expanding, and scaling our microservices, app, and site.
● Integrate data from various back-end services and databases.
● Stay plugged into emerging technologies and industry trends and apply them to operations and
activities.
● Comfortably work and thrive in a fast-paced environment, learn rapidly and master diverse web
technologies and techniques.
● Juggle multiple tasks within the constraints of timelines and budgets with business acumen.
What skills do I need?
● Excellent programming skills and in-depth knowledge of modern HTML5, CSS3 (including
preprocessors like SASS).
● Excellent programming skills in Javascript or Typescript.
● Basic understanding of Node.js with the Nest framework or equivalent.
● Good programming skills in Vue 3.x with Composition API.
● Good understanding of using CSS frameworks like Quasar, Tailwind, etc.
● A solid understanding of how web applications work including security, session management, and
best development practices.
● Adequate knowledge of database systems, OOPs and web application development.
● Good functional understanding of containerising applications using Docker.
● Basic understanding of how cloud infrastructures like AWS, GCP work.
● Basic understanding of setting up a GitHub CI/CD pipeline to automate building Docker images,
pushing them to AWS ECR & deploying to the cluster.
● Proficient understanding of code versioning tools, such as Git (or equivalent).
● Proficient understanding of Agile methodology.
● Hands-on experience with network diagnostics, monitoring and network analytics tools.
● Basic knowledge of the Search Engine Optimization (SEO) process.
● Aggressive problem diagnosis and creative problem-solving skills.
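The GitHub CI/CD expectation above can be sketched as a workflow. Everything here (repo layout, image name, AWS region, secret names, deploy step) is a hypothetical placeholder, not a prescribed setup:

```yaml
# Sketch of a GitHub Actions workflow: build a Docker image, push it to
# AWS ECR, then deploy to the cluster. All names below are placeholders.
name: build-and-deploy
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-south-1
      - name: Log in to Amazon ECR
        id: ecr
        uses: aws-actions/amazon-ecr-login@v2
      - name: Build and push image
        run: |
          IMAGE=${{ steps.ecr.outputs.registry }}/my-app:${{ github.sha }}
          docker build -t "$IMAGE" .
          docker push "$IMAGE"
      # The deploy step depends on the target cluster (ECS/EKS) and is
      # left as a placeholder here.
      - name: Deploy
        run: echo "cluster deploy step goes here"
```

Tagging the image with `github.sha` keeps every deployment traceable to a commit; the ECR registry host comes from the login step's output.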
Summary
Our Kafka developer combines technical skills, communication skills, and business knowledge, and should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and in a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
- Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at all levels, from senior management to developers to business SMEs, for project definition.
- Works on multiple platforms and multiple projects concurrently.
- Perform code and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience with Kafka Connect using Schema Registry in a very high-volume environment (~900 million messages).
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams (KStreams), and Confluent Control Center.
- Provide expertise and hands-on experience with AvroConverters, JsonConverters, and StringConverters.
- Provide expertise and hands-on experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, the FileStream connector, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience building custom connectors using Kafka core concepts and the API.
- Working knowledge of the Kafka REST Proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms. Leverage Hadoop-ecosystem knowledge to design and develop capabilities that deliver our solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
- Use automation tools for provisioning, such as Jenkins, Udeploy, or relevant technologies.
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
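As an illustration of the connector work described above, a JDBC source connector is typically configured with a JSON document submitted to the Kafka Connect REST API (`POST /connectors`). The connector name, database host, table, and credentials below are placeholders, not a real deployment:

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "2",
    "connection.url": "jdbc:mysql://db-host:3306/sales",
    "connection.user": "kafka_connect",
    "connection.password": "********",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql.sales.",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```

`mode: incrementing` tells the connector to poll for new rows by a monotonically increasing column, and the Avro value converter is where the Schema Registry mentioned in the requirements comes into play.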