11+ High Performance Computing (HPC) Jobs in Hyderabad
Apply to 11+ High Performance Computing (HPC) Jobs in Hyderabad on CutShort.io. Explore the latest High Performance Computing (HPC) Job opportunities across top companies like Google, Amazon & Adobe.
As a Lead Solutions Architect at Aganitha, you will:
* Engage and co-innovate with customers in BioPharma R&D
* Design and oversee implementation of solutions for BioPharma R&D
* Manage Engineering teams using Agile methodologies
* Enhance reuse with platforms, frameworks and libraries
Candidates must have demonstrated expertise in the following areas:
1. App dev with modern tech stacks of Python, ReactJS, and fit-for-purpose database technologies
2. Big data engineering with distributed computing frameworks
3. Data modeling in scientific domains, preferably in one or more of: Genomics, Proteomics, Antibody engineering, Biological/Chemical synthesis and formulation, Clinical trials management
4. Cloud and DevOps automation
5. Machine learning and AI (Deep learning)
- 2+ years of experience working with the Oracle Commerce Cloud application
- 4-6 years of overall experience in web application development
Must have strong knowledge of:
- Oracle Commerce Cloud (extensive knowledge)
- Server-side Java for enterprise-level applications (J2EE)
- JavaScript (Knockout preferred), Bootstrap, jQuery, Node.js, HTML5, CSS
- SQL Server, MySQL, and relational data models in the context of web integration
- Web services (WCF, Web API, SOAP, REST, SOA)
- Excellent Communication Skills
Good to have:
- Understanding of Oracle Cloud technologies
- Experience working in an Agile environment
Working Hours: 1:00PM to 10:00PM
Your job as a business development manager is to identify sales leads, pitch our product to new clients and onboard them, maintain good working relationships with new contacts/schools, and communicate new product developments to prospective clients.
- Meet schools and onboard them
- Create and manage the top-level sales funnel and generate leads
- Organize and coordinate operations to ensure maximum productivity
- Manage customer and channel-affiliated relationships with clients
- Conduct demos and presentations of the product
- Monitor sales, marketing & operational activities; implement strategies to maximise channel sales & collections as well as smooth operations
- Review your own sales performance, aiming to meet or exceed targets
- Gather market and customer information and provide feedback on it
Why you?
- 5+ years of experience in a sales role with schools, preferably in edtech, with a client base of around 100-150 schools in the region (direct relationships with school management).
- B.Tech/MBA degree.
- Detailed understanding of the principles of sales
- Strong communication skills
- Proven track record in achieving sales quotas
- Open to travelling.
Perks:
Apart from very competitive compensation, you will:
- Work in a fast-paced growth environment
- Work with a very experienced team
Must have attributes:
- Extreme sense of ownership
- Bias for action
- Ability to preempt and iterate on the course of action to achieve targets.
Publicis Sapient
Job Summary:
As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components of data engineering solutions. You will apply a deep understanding of data integration and big data design principles in creating custom solutions or implementing packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode (a minimal sketch follows this list)
• Build functionality for data analytics, search and aggregation
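For illustration, the batch and real-time ingestion responsibility above typically reduces to a pipeline like the following minimal PySpark sketch. The schema, file paths, broker address, and topic name are placeholder assumptions, and the Kafka source additionally requires the spark-sql-kafka connector package.

# A minimal PySpark sketch of batch + real-time ingestion. Paths, schema,
# broker, and topic are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("source", StringType()),
    StructField("value", DoubleType()),
])

# Batch mode: load a periodic CSV drop from a source system into Parquet.
batch_df = (spark.read
            .option("header", True)
            .schema(event_schema)
            .csv("/data/landing/daily/"))  # hypothetical landing path
batch_df.write.mode("append").parquet("/data/curated/events/")

# Real-time mode: consume the same events from a Kafka topic
# (requires the spark-sql-kafka connector on the classpath).
stream_df = (spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical
             .option("subscribe", "events")                     # hypothetical
             .load()
             .select(from_json(col("value").cast("string"), event_schema).alias("e"))
             .select("e.*"))

query = (stream_df.writeStream
         .format("parquet")
         .option("path", "/data/curated/events_stream/")
         .option("checkpointLocation", "/data/checkpoints/events/")
         .start())
query.awaitTermination()  # block while the streaming query runs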
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies
2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, including IAM and data security
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality
6. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Responsibilities -
- Collaborate with the development team to understand data requirements and identify potential scalability issues.
- Design, develop, and implement scalable data pipelines and ETL processes to ingest, process, and analyse large volumes of data from various sources.
- Optimize data models and database schemas to improve query performance and reduce latency.
- Monitor and troubleshoot the performance of our Cassandra database on Azure Cosmos DB, identifying bottlenecks and implementing optimizations as needed (a minimal connection-and-modelling sketch follows this list).
- Work with cross-functional teams to ensure data quality, integrity, and security.
- Stay up to date with emerging technologies and best practices in data engineering and distributed systems.
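For illustration, connecting to and modelling data in a Cassandra database hosted on Azure Cosmos DB (Cassandra API) with the DataStax Python driver looks roughly like the sketch below. The account name, key, keyspace, and table are placeholder assumptions; Cosmos DB's Cassandra endpoint listens on port 10350 over TLS.

# A minimal sketch of connecting to a Cassandra database on Azure Cosmos DB
# (Cassandra API) with the DataStax Python driver. Account name, key,
# keyspace, and table are placeholders.
import ssl
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE  # enable proper cert validation in production

auth = PlainTextAuthProvider(username="<cosmos-account>",      # placeholder
                             password="<cosmos-account-key>")  # placeholder

cluster = Cluster(["<cosmos-account>.cassandra.cosmos.azure.com"],
                  port=10350, auth_provider=auth, ssl_context=ssl_context)
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS shop
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")

# Query-first data modelling: one table per access pattern, partitioned so
# that each read is served from a single partition.
session.execute("""
    CREATE TABLE IF NOT EXISTS shop.orders_by_customer (
        customer_id text,
        order_ts    timestamp,
        order_id    text,
        total       double,
        PRIMARY KEY ((customer_id), order_ts)
    ) WITH CLUSTERING ORDER BY (order_ts DESC)
""")

rows = session.execute(
    "SELECT order_id, total FROM shop.orders_by_customer WHERE customer_id = %s",
    ("c-42",))
for row in rows:
    print(row.order_id, row.total)

Keying the table by customer_id keeps each read within a single partition, which is the usual first lever for reducing latency and request-unit consumption on Cosmos DB.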
Qualifications & Requirements -
- Proven experience as a Data Engineer or similar role, with a focus on designing and optimizing large-scale data systems.
- Strong proficiency in working with NoSQL databases, particularly Cassandra.
- Experience with cloud-based data platforms, preferably Azure Cosmos DB.
- Solid understanding of distributed systems, data modelling, data warehouse design, and ETL processes.
- Detailed understanding of Software Development Life Cycle (SDLC) is required.
- Good to have: knowledge of a visualization tool such as Power BI or Tableau.
- Good to have: knowledge of the SAP landscape (SAP ECC, SLT, BW, HANA, etc.).
- Good to have: experience on a data migration project.
- Knowledge of Supply Chain domain would be a plus.
- Familiarity with software architecture (data structures, data schemas, etc.)
- Familiarity with Python programming language is a plus.
- The ability to work in a dynamic, fast-paced, work environment.
- A passion for data and information with strong analytical, problem solving, and organizational skills.
- Self-motivated with the ability to work under minimal direction.
- Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Database Integration
- Java
- OOP concepts
- Spring Boot
- NoSQL (MongoDB)
- SQL Server
- REST API
- MVC
- Hibernate
- API optimization
- Multi-processing
Desired Competencies (Technical/Behavioral) - Must-Have
- 2+ years of relevant work experience in Java & Spring
- Experience in MongoDB (optional)
- Experience working in software development
- Proficient in Spring Boot
- Good understanding of the Java language
- Knowledge of NoSQL & SQL databases
- Experience in functional and technical design
- Experience working with front-end technologies
Good-to-Have
- Experience with user-interface development
- Background in computer science, with a focus on data structures, algorithms, and API design
- Ability to learn other coding languages as needed
- Demonstrated ability to share knowledge via formal mentoring, reviewing code, reviewing design documents, providing technical talks, teaching classes, or consulting
Mandatory skills:
- Programming skills in Python, Robot Framework, Selenium, and shell scripting
- Experience with L2/L3 protocols: VLAN, DHCP, LACP, IGMP, PPPoE
- Familiarity with device configuration protocols: CLI, NETCONF, SNMP
- Experience in telecom technologies like DSLAM, GPON, G.fast, and next-gen broadband technologies is highly recommended
- Knowledge of regression, performance, load, scale, and stability test areas
- Hands-on experience with common industry equipment such as Spirent TestCenter, Ixia, Abacus, Shenick (TeraVM), and N2X traffic generators
- Exposure to debug tools such as Wireshark and tcpdump
Job Requirements:
- Knowledge of the software test cycle, test plan, and test case creation
- Understanding of end-to-end test setup topology and debugging
- Ability to perform system-level functional and non-functional tests
- Familiarity with the manual test life cycle
- Designing and writing test automation scripts using automation frameworks (a minimal sketch follows this list)
- Exposure to CI/CD pipeline implementation and maintenance using Jenkins and Groovy scripting
- Linux skills (system configuration and administration, containers, networking such as sockets, and database management)
- Good debugging skills and knowledge of various debug tools
- Bug reporting/tracking and providing logs
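For illustration, a typical automated test for this kind of role, sketched below in Python in a pytest style, configures a feature on the device under test over its CLI and asserts on the result. The management IP, credentials, and CLI syntax are placeholder assumptions that vary by vendor; in practice the same flow is often expressed as a Robot Framework keyword suite.

# A minimal pytest-style sketch: configure a VLAN on a device under test
# over SSH and verify it. Host, credentials, and CLI syntax are placeholders.
import paramiko

DUT_HOST = "192.0.2.10"                # placeholder DUT management IP
DUT_USER, DUT_PASS = "admin", "admin"  # placeholder credentials

def run_cli(client, command):
    """Run one CLI command on the DUT and return its stdout as text."""
    _, stdout, _ = client.exec_command(command)
    return stdout.read().decode()

def test_vlan_creation():
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(DUT_HOST, username=DUT_USER, password=DUT_PASS)
    try:
        run_cli(client, "configure vlan 100 name regression-test")  # hypothetical CLI
        output = run_cli(client, "show vlan 100")                   # hypothetical CLI
        # Pass only if the DUT reports the VLAN as configured.
        assert "regression-test" in output
    finally:
        client.close()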
Responsibilities for Staff Engineer role:
- Experience in Java/Python/Golang along with Spring Boot, microservices, and RDBMS
- Experience required: 10 to 15 years



