11+ Scientific Computing Jobs in Hyderabad
As a Lead Solutions Architect at Aganitha, you will:
* Engage and co-innovate with customers in BioPharma R&D
* Design and oversee implementation of solutions for BioPharma R&D
* Manage Engineering teams using Agile methodologies
* Enhance reuse with platforms, frameworks and libraries
Candidates must have demonstrated expertise in the following areas:
1. Application development with modern tech stacks: Python, ReactJS, and fit-for-purpose database technologies
2. Big data engineering with distributed computing frameworks
3. Data modeling in scientific domains, preferably in one or more of: Genomics, Proteomics, Antibody engineering, Biological/Chemical synthesis and formulation, Clinical trials management
4. Cloud and DevOps automation
5. Machine learning and AI (Deep learning)
On-site at Cloudnine Hospital
📍 Location: Bangalore, Delhi, Ghaziabad, Hyderabad
🏢 Company: Cryoviva Biotech Pvt Ltd
🕒 Type: Full-time | On-site | Inside Sales
🎓 Eligibility: B.Sc / M.Sc in Biotechnology, Microbiology, Biochemistry or related life sciences (Freshers welcome!)
Key Responsibilities:
Educate pregnant women and families about stem cell preservation and its future health benefits.
Counsel clients, answer their questions, and help them understand the scientific and emotional value of the service.
Coordinate with doctors, gynecologists, and hospital staff to maintain smooth communication and support.
Build relationships with clients and hospital teams to promote awareness and trust in our services.
Drive enrollments by guiding and assisting parents through the decision-making and registration process.
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium, or similar frameworks (see the sketch after this list).
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
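For context, a minimal sketch of the kind of streaming ingestion work described above, assuming a Debezium-style CDC feed on Kafka and an S3 data lake; the broker, topic, schema, and bucket names below are hypothetical placeholders, not part of the role description:

```python
# Illustrative only: PySpark Structured Streaming consuming Debezium-style CDC
# events from Kafka and appending them to S3 as Parquet. Requires the
# spark-sql-kafka package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("orders-cdc-ingest").getOrCreate()

# Simplified view of a Debezium change event: operation type plus the new row image.
event_schema = StructType([
    StructField("op", StringType()),
    StructField("after", StructType([
        StructField("order_id", StringType()),
        StructField("status", StringType()),
    ])),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders.cdc")                  # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.op", "e.after.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/orders/")     # placeholder bucket
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/orders/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

In a production pipeline the same pattern is typically extended with schema-registry integration, dead-letter handling, and compaction into Delta/Iceberg/Hudi tables.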
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lake house architectures.
- Knowledge of data warehousing concepts and query optimisation.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions (illustrated in the sketch after this list)
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
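As an illustration of the orchestration item above, a minimal Airflow DAG that schedules a daily spark-submit run; the DAG id, schedule, and script path are hypothetical:

```python
# Illustrative only: a small Airflow 2.x DAG that triggers a Spark batch job once a day.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_spark_ingest",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_spark = BashOperator(
        task_id="spark_submit",
        bash_command="spark-submit /opt/jobs/ingest.py",  # hypothetical job script
    )
```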
WHO YOU ARE
To be successful in this role, you’ll need to have the following skills:
· Love for coding: A fanatic about writing beautiful and scalable code.
· A sense of analytics: Strong analytical and troubleshooting skills. Should be resourceful, innovative and inventive.
· Dynamic: Comfortable dealing with lots of moving pieces, with exquisite attention to detail, and comfortable learning new technologies and systems.
· Team player: Knack for influencing without being authoritative. Pitch in wherever the team needs help, from writing blog posts to supporting customers.
· Accountability: High sense of ownership for your code and relentlessness to deliver projects with high business impact.
KEY QUALIFICATIONS
· BE/BTech in Computer Science or related field.
· 5+ years of professional production and development experience with leading-edge server and database technologies (e.g., Python, Java, Node.js, Scala, Spring Boot, MySQL, and NoSQL databases).
KEY SKILLS
· Strong computer system analysis and design skills in current methodologies and patterns.
· Experience with professional production cloud environments (AWS preferred).
· Experience with RESTful Services and APIs.
Backend Architect:
Technology: Node.js, DynamoDB / MongoDB
Roles:
- Design & implement Backend Services.
- Able to redesign the architecture.
- Designing & implementation of application in MVC & Microservice.
- 9+ years of experience developing service-based applications using Node.js.
- Expert-level skills in developing web applications using JavaScript, CSS and HTML5.
- Experience working on teams that practice BDD (Behavior-Driven Development).
- Understanding of micro-service architecture and RESTful API integration patterns.
- Experience using Node.js for automation and leveraging NPM for package management.
- Solid Object Oriented design experience, and creating and leveraging design patterns.
- Experience working in a DevOps/Continuous Delivery environment and associated toolsets (e.g., Jenkins, Puppet).
Desired/Preferred Qualifications :
- Bachelor's degree or equivalent experience
- Strong problem solving and conceptual thinking abilities
- Desire to work in a collaborative, fast-paced, start-up like environment
- Experience leveraging node.js frameworks such as Express.
- Experience with distributed source control management, e.g., Git
Oracle Cloud HCM Technical
Experience: 6–12 years
Location: Hyderabad, Chennai, Bangalore, Pune, Kolkata, Noida, Gurgaon, Mumbai
1. Practitioner/expert-level knowledge of at least two of the following tools: HDL, HCM Extracts, BIP Reports, OTBI Reports, REST Web Services, Fast Formula
2. In-depth knowledge of at least two of the HCM Cloud modules:
∙Core HR
∙Absences
∙Benefits
∙Payroll
∙Workforce Compensation
∙OTL
∙Taleo
∙OLC
∙Oracle Recruiting
1. Cloud certification will be an advantage
2. Very good communication skills and excellent problem-solving skills
3. Direct client interaction experience will be an advantage
- You would be responsible for sales of residential properties.
- You would be required to follow all standard operating procedures for effective sales.
- You would need to attend to all customer queries and ensure that queries and complaints are forwarded to the right individual in good time, so that they can address and resolve the issue at the earliest.
- You would have to collect and compile customer data on a timely basis.
- You will need to ensure that all reports are duly completed in time with efficiency.
- You will also need to make corporate visits & presentations.
- You will need to participate in the survey conducted by the sales department with regards to the market & competitors.
- You would have to make every effort to maximize both present and long-term sales and gross profits.
- You will need to call and advise our prospective customers on a daily basis, using the telephone, mobile handset, or automatic dialer system provided by the company. Through these calls you will need to encourage potential customers to visit our site, and with proper follow-ups ensure that they eventually buy their dream home from Pacifica.
• Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc)
• Should have good hands-on experience with Spark (Spark with Java / PySpark)
• Hive
• Must be good with SQL (Spark SQL / HiveQL); see the sketch after this list
• Application design, software development and automated testing
Environment Experience:
• Experience with implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, Automated testing, and Junit.
• Demonstrated experience with Agile or other rapid application development methods
• Cloud development (AWS/Azure/GCP)
• Unix / Shell scripting
• Web services, open API development, and REST concepts
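As a small illustration of the Spark SQL / HiveQL point above, a sketch of a PySpark job that reads a Hive table and writes back an aggregate; the database, table, and column names are hypothetical:

```python
# Illustrative only: Spark SQL over a Hive table, written back with saveAsTable.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("orders-daily-summary")
    .enableHiveSupport()   # lets Spark SQL resolve Hive tables through the metastore
    .getOrCreate()
)

# The same aggregation could be expressed in HiveQL; Spark SQL accepts it directly.
daily_summary = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM sales.orders        -- hypothetical Hive table
    GROUP BY order_date
""")

daily_summary.write.mode("overwrite").saveAsTable("sales.orders_daily_summary")
```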
At Nspira we want to develop an ecosystem of apps to dominate the personal growth market in the education sector. Being one of our talented mobile developers, you'll have an opportunity to learn and grow and be a key part of creating our products.
Engineering Culture :
- We run functional teams based on the SCRUM methodology and are a fairly Agile environment, with 2-week sprints and teams with a minimum of 2 developers (preferably 3) plus a Platform Owner.
- Sprints cannot be interrupted and work is planned out well in advance to keep stress down to a minimum.
Other stuff we do/you will encounter:
- Line-by-line code reviews for every pull request on GitLab maintain code quality and keep everyone in the loop and learning.
- At least 2 people working together on every project (collaborating not pairing)
Required skills:
- Swift is strongly preferred
- Experience with clean architecture, MVVM, and other design patterns
- Mastery of Storyboards and Auto Layout
- Familiarity with RESTful web services
- Experience with Static Frameworks
- Great logic and problem-solving skills
- Practical understanding of Continuous Integration and Delivery
Your application must include:
- A resume in PDF format. Include links to your software, mobile apps, and coding samples so we can see proof of your talents.
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership in developing and managing continuously improving, robust, scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have a chance to make a measurable impact on the future of Thinkdeeply as well as take on a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Experience
12+ Years
Location
Hyderabad
Skills
Bachelor's/Master's/PhD in CS or equivalent industry experience
10+ years of industry experience with Java-related frameworks such as Spring and/or Typesafe
Experience with scripting languages; Python experience highly desirable (5+ years of industry experience in Python)
Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
Demonstrated expertise of building and shipping cloud native applications
Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd
Experience in API development using Swagger
Strong expertise with containerization technologies, including Kubernetes and Docker Compose
Experience with cloud platform services such as AWS, Azure or GCP.
Implementing automated testing platforms and unit tests
Proficient understanding of code versioning tools, such as Git
Familiarity with continuous integration, Jenkins
Responsibilities
Architect, design, and implement large-scale data processing pipelines
Design and implement APIs (see the sketch after this list)
Assist in DevOps operations
Identify performance bottlenecks and bugs, and devise solutions to these problems
Help maintain code quality, organization, and documentation
Communicate with stakeholders regarding various aspects of the solution.
Mentor team members on best practices
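As a sketch of the API work listed above: the posting mentions Swagger but does not name a framework, so, purely as an assumption for illustration, this uses FastAPI, which serves auto-generated Swagger/OpenAPI docs at /docs; the service and model names are hypothetical:

```python
# Illustrative only: a minimal FastAPI service; Swagger UI is auto-generated at /docs.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Pipeline Status API")  # hypothetical service name

class PipelineStatus(BaseModel):
    name: str
    state: str

# In-memory stand-in for a real status store.
_STATUSES = {"daily-ingest": PipelineStatus(name="daily-ingest", state="running")}

@app.get("/pipelines/{name}", response_model=PipelineStatus)
def get_pipeline(name: str) -> PipelineStatus:
    """Return the status of a (hypothetical) data pipeline by name."""
    return _STATUSES.get(name, PipelineStatus(name=name, state="unknown"))
```

Run with, e.g., `uvicorn main:app --reload` (if the file is main.py) and browse /docs for the generated Swagger UI.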
Role: Full-time
Experience Level: 8 to 13 years
Job Location: Hyderabad
Key Responsibilities :
Serves as a technical point of contact within the organization by:
Influencing the product requirements, behaviour and design (Automation Platform)
Driving early adoption of technology, features and best practices around product development
Lead development at all layers: GUI, backend (DevOps tools API integration), and DB
Work with a team of developers and testers in a highly agile environment to produce high-quality software.
Design and develop in-house tools; also expected to demonstrate new ideas through prototypes/proofs of concept.
Evaluate and assess newer technologies/architectures for product development.
Keep up to date with emerging technologies/tools and development trends in the DevOps space to assess their impact on projects.
Must have:
Should possess a Bachelor's/Master's/PhD in Computer Science with 8+ years of experience
Should possess a minimum of 3 years of experience in Products/Tools Development
Should possess expertise in using various DevOps tools, libraries, and APIs (Jenkins, JIRA, AWX, Nexus, GitHub, Bitbucket, SonarQube)
Experience in designing and developing products, tools or test automation frameworks using Java or Python technologies.
Should have a strong understanding of OOP, SDLC (Agile/SAFe standards), and STLC
Proficient in Python, with a good knowledge of its ecosystems (IDEs and Frameworks)
Familiar with designing and developing applications using AngularJS, HTML5, Bootstrap, NodeJS, MongoDB, etc.
Experience in implementing, consuming, and testing web services / REST APIs would be an added advantage (see the sketch below).
Experience working as a Full-Stack developer would be an added advantage
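As an illustration of the DevOps tools API integration and REST consumption points above, a short Python sketch against the standard Jenkins REST endpoints; the server URL, job name, and credentials are placeholders:

```python
# Illustrative only: consuming Jenkins' REST API with the requests library.
import requests

JENKINS_URL = "https://jenkins.example.com"   # placeholder server
AUTH = ("ci-user", "api-token")               # Jenkins user + API token (placeholders)

# List the jobs visible to this user.
jobs = requests.get(f"{JENKINS_URL}/api/json", auth=AUTH, timeout=30).json()["jobs"]
print([job["name"] for job in jobs])

# Trigger a build of a specific job; Jenkins replies 201 with a queue-item Location header.
resp = requests.post(f"{JENKINS_URL}/job/nightly-regression/build", auth=AUTH, timeout=30)
resp.raise_for_status()
print("Queued at:", resp.headers.get("Location"))
```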
Regards,
Talent Acquisition Team