11+ 3D computer graphics Jobs in Hyderabad | 3D computer graphics Job openings in Hyderabad
Apply to 11+ 3D computer graphics Jobs in Hyderabad on CutShort.io. Explore the latest 3D computer graphics Job opportunities across top companies like Google, Amazon & Adobe.
Designation: Graphics and Simulation Engineer
Experience: 3-15 Yrs
Position Type: Full Time
Position Location: Hyderabad
Description:
We are looking for engineers to work on applied research problems related to computer graphics in autonomous driving of electric tractors. The team works towards creating a universe of farm environments in which tractors can drive around, for the purposes of simulation, synthetic data generation for deep learning training, simulation of edge cases, and physics modelling.
Technical Skills:
● Background in OpenGL, OpenCL, graphics algorithms and optimization is necessary.
● Solid theoretical background in computational geometry and computer graphics is desired. Deep learning background is optional.
● Experience in two-view and multi-view geometry.
● Necessary Skills: Python, C++, Boost, OpenGL, OpenCL, Unity3D/Unreal, WebGL, CUDA.
● For freshers, academic experience in graphics is also valued.
● Experienced computer graphics candidates with no prior deep learning experience who are willing to apply their knowledge to vision problems are also encouraged to apply.
● Software development experience on low-power embedded platforms is a plus.
Responsibilities:
● Strong grasp of engineering principles and a clear understanding of data structures and algorithms.
● Ability to understand, optimize and debug imaging algorithms.
● Ability to drive a project from conception to completion, from research papers to code, with a disciplined approach to software development on the Linux platform.
● Demonstrate outstanding ability to perform innovative and significant research in the form of technical papers, thesis, or patents.
● Optimize runtime performance of designed models.
● Deploy models to production and monitor performance and debug inaccuracies and exceptions.
● Communicate and collaborate with team members in India and abroad for the fulfillment of your duties and organizational objectives.
● Thrive in a fast-paced environment and own the project end to end with minimal hand-holding.
● Learn and adapt to new technologies and skill sets.
● Work on projects independently with timely delivery and a defect-free approach.
● Candidates whose thesis focuses on the above skill set may be given preference.
Job description
Title - Full Stack Developer
Job Type - Full Time
Skills - Java, Spring Boot, SQL Server, PostgreSQL, MongoDB, AngularJS, TypeScript, Microservice Architecture, Kafka, Git, Git Flow Development, AWS, Azure, APIs, Web Services, CI/CD Pipeline, Agile/SCRUM, DataDog
Location - Hyderabad, Telangana
Experience - 6 to 9 Years
Annual CTC (INR) - 23 LPA to 28 LPA
Deadline - 07/04/2025
Job Description
Position: Fullstack Developer
Location: Hyderabad, India
Employment Type: Full-Time
Open Positions: 7
Role Overview:
We are seeking experienced Fullstack Developers with 6-9 years of expertise in designing, developing, and maintaining scalable, distributed applications. The ideal candidate will be proficient in Java, Spring Boot, AngularJS, and TypeScript, with strong experience in cloud-based development, CI/CD pipelines, and Agile methodologies.
Key Responsibilities:
- Troubleshoot and resolve complex data, system, and software issues in production.
- Develop emergency bug fixes and manage production applications.
- Ensure production issues are resolved within SLA timelines.
- Deploy application changes using CI/CD pipelines.
- Review and manage production changes using ServiceNow.
- Lead scrum teams in Agile environments, ensuring high-quality technology solutions.
- Develop and enhance application frameworks with a focus on performance and scalability.
- Implement unit tests, container build checks, and API tests to support shift-left practices.
- Evaluate new platforms, tools, and technologies to optimize development workflows.
- Provide technical guidance, code reviews, and mentorship to team members.
Required Technical Skills:
- Strong experience in Java and Spring Boot application development.
- Proficiency in RDBMS (SQL Server/PostgreSQL) and NoSQL (MongoDB/ElasticSearch).
- Hands-on experience with AngularJS, TypeScript, and event-driven architecture.
- Solid understanding of messaging queues like Kafka.
- Expertise in Git and Git flow for code lifecycle management.
- Cloud experience with AWS or Azure, including API and web service development.
- Hands-on experience with CI/CD deployment pipelines and DevOps tools.
- Familiarity with monitoring tools such as Datadog, Dynatrace.
Nice to Have:
- Experience with Azure DevOps, SonarQube, and monitoring tools like StatsD.
- Test automation expertise.
Soft Skills & Leadership:
- Excellent problem-solving and analytical abilities.
- Strong communication and stakeholder management skills.
- Ability to lead Agile teams and drive best development practices.
Additional Requirements:
- Must be available to join within 3 weeks.
- Willing to attend face-to-face interviews as per company requirements.
- Open to relocating to Hyderabad if not already based there.
Role - Performance tester
Location – Hyderabad
Experience – 6.5 to 10 years only (at least 5 years on LoadRunner; strong LoadRunner skills required)
Compensation – between 13 and 17 LPA
We need associates who are strong in LoadRunner. The JD is below.
- Ability to work independently on project performance testing with minimal supervision.
- Should have strong performance testing skills, with working experience in API testing and web applications.
- Understanding of the requirements/architecture and NFRs.
- Good experience in performance testing using Micro Focus Performance Center (LoadRunner).
- Able to understand the complex architecture of products developed in various technologies such as Java, Spring Boot, PCF/AWS cloud technologies, etc.
- Hands-on experience in developing scripts, designing test scenarios, test execution, monitoring, and analyzing results.
- Participate in the development and reporting of test metrics; items such as test confidence and test coverage reports.
- Experience in setting up/designing load scenarios for various tests such as load tests, endurance tests, stress tests, capacity measurements, etc.
- Experience in profiling and monitoring tools like AppDynamics, Grafana, Kibana
- Monitor application logs to determine system behaviour; address all technical issues and facilitate resolution and necessary follow-up with Development and other cross-functional departments.
- Analyse CPU utilization, memory usage, garbage collection, and DB monitoring parameters and DB reports to verify application performance.
- Experience in Setting up Test Environments, Debugging Environment issues.
- Experience in identifying H/W and S/W needs for setting up test environments.
- Knowledge of sound development practices, with exposure to Agile development, release engineering, and continuous integration, is a plus.
Development and Customization:
Build and customize Frappe modules to meet business requirements.
Develop new functionalities and troubleshoot issues in ERPNext applications.
Integrate third-party APIs for seamless interoperability.
Technical Support:
Provide technical support to end-users and resolve system issues.
Maintain technical documentation for implementations.
Collaboration:
Work with teams to gather requirements and recommend solutions.
Participate in code reviews for quality standards.
Continuous Improvement:
Stay updated with Frappe developments and optimize application performance.
Skills Required:
Proficiency in Python, JavaScript, and relational databases.
Knowledge of Frappe/ERPNext framework and object-oriented programming.
Experience with Git for version control.
Strong analytical skills
We are looking for skilled and passionate Data Engineers with a strong foundation in Python programming and hands-on experience working with APIs, AWS cloud, and modern development practices. The ideal candidate will have a keen interest in building scalable backend systems and working with big data tools such as PySpark.
Key Responsibilities:
- Write clean, scalable, and efficient Python code.
- Work with Python frameworks such as PySpark for data processing.
- Design, develop, update, and maintain APIs (RESTful).
- Deploy and manage code using GitHub CI/CD pipelines.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work on AWS cloud services for application deployment and infrastructure.
- Perform basic database design and interact with MySQL or DynamoDB.
- Debug and troubleshoot application issues and performance bottlenecks.
Required Skills & Qualifications:
- 4+ years of hands-on experience with Python development.
- Proficient in Python basics with a strong problem-solving approach.
- Experience with AWS Cloud services (EC2, Lambda, S3, etc.).
- Good understanding of API development and integration.
- Knowledge of GitHub and CI/CD workflows.
- Experience in working with PySpark or similar big data frameworks.
- Basic knowledge of MySQL or DynamoDB.
- Excellent communication skills and a team-oriented mindset.
Nice to Have:
- Experience in containerization (Docker/Kubernetes).
- Familiarity with Agile/Scrum methodologies.
About Company:
The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.
- Role Overview
- A Senior Engineer with a strong background and experience in cloud-related technologies and architectures, who can design target cloud architectures to transform existing architectures together with the in-house team, and can actively configure and build cloud architectures hands-on while guiding others.
- Key Knowledge
- 3-5+ years of experience in AWS/GCP or Azure technologies
- Is likely certified on one or more of the major cloud platforms
- Strong experience from hands-on work with technologies such as Terraform, K8S, Docker and orchestration of containers.
- Ability to guide and lead internal agile teams on cloud technology
- Background in the financial services industry or similar critical operational environments
We are seeking a Senior Data Scientist with hands-on experience in Generative AI (GenAI) and Large Language Models (LLM). The ideal candidate will have expertise in building, fine-tuning, and deploying LLMs, as well as managing the lifecycle of AI models through LLMOps practices. You will play a key role in driving AI innovation, developing advanced algorithms, and optimizing model performance for various business applications.
Key Responsibilities:
- Develop, fine-tune, and deploy Large Language Models (LLM) for various business use cases.
- Implement and manage the operationalization of LLMs using LLMOps best practices.
- Collaborate with cross-functional teams to integrate AI models into production environments.
- Optimize and troubleshoot model performance to ensure high accuracy and scalability.
- Stay updated with the latest advancements in Generative AI and LLM technologies.
Required Skills and Qualifications:
- Strong hands-on experience with Generative AI, LLMs, and NLP techniques.
- Proven expertise in LLMOps, including model deployment, monitoring, and maintenance.
- Proficiency in programming languages like Python and frameworks such as TensorFlow, PyTorch, or Hugging Face.
- Solid understanding of AI/ML algorithms and model optimization.
JD for SharePoint and Power Apps Developer Priority
Job Summary: We are looking for an experienced SharePoint and Power Apps Developer to join our team. The successful candidate will be responsible for designing, developing, testing, and deploying custom solutions on SharePoint and Power Apps platforms. The ideal candidate should have a strong background in SharePoint and Power Apps development, as well as experience with Microsoft Power Platform technologies.
Responsibilities:
• Design, develop, test, and deploy custom solutions on SharePoint and Power Apps platforms
• Develop and implement custom workflows and forms using Microsoft Power Platform technologies
• Develop and maintain SharePoint and Power Apps solutions using best practices
• Create and maintain technical documentation and user manuals
• Work closely with stakeholders to identify requirements and develop solutions that meet business needs
• Collaborate with other developers and team members to ensure timely delivery of high-quality solutions
• Stay up-to-date with the latest SharePoint and Power Apps development trends and technologies
Requirements:
• Bachelor’s degree in Computer Science, Information Systems, or related field
• Minimum of 5 years of experience in SharePoint and Power Apps development
• Strong experience with Microsoft Power Platform technologies such as Power Automate, Power BI, and Power Virtual Agents
• Strong experience with SharePoint development, including SharePoint Framework (SPFx), SharePoint Designer, and SharePoint Online
• Experience with SharePoint migration and administration is a plus
• Knowledge of HTML, CSS, JavaScript, and TypeScript
• Strong analytical and problem-solving skills
• Ability to work independently and as part of a team
• Excellent communication and interpersonal skills
• Ability to multitask and manage multiple priorities.
Job Type: Full-time
Experience- 5 yrs (not below)
Location: Hyderabad (WFO)
We are actively seeking a self-motivated Data Engineer with expertise in Azure cloud and Databricks, and a thorough understanding of Delta Lake and Lakehouse architecture. The ideal candidate should excel in developing scalable data solutions, crafting platform tools, and integrating systems, while demonstrating proficiency in cloud-native database solutions and distributed data processing.
Key Responsibilities:
- Contribute to the development and upkeep of a scalable data platform, incorporating tools and frameworks that leverage Azure and Databricks capabilities.
- Exhibit proficiency in various RDBMS databases such as MySQL and SQL-Server, emphasizing their integration in applications and pipeline development.
- Design and maintain high-caliber code, including data pipelines and applications, utilizing Python, Scala, and PHP.
- Implement effective data processing solutions via Apache Spark, optimizing Spark applications for large-scale data handling.
- Optimize data storage using formats like Parquet and Delta Lake to ensure efficient data accessibility and reliable performance.
- Demonstrate understanding of Hive Metastore, Unity Catalog Metastore, and the operational dynamics of external tables.
- Collaborate with diverse teams to convert business requirements into precise technical specifications.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or a related discipline.
- Demonstrated hands-on experience with Azure cloud services and Databricks.
- Proficient programming skills in Python, Scala, and PHP.
- In-depth knowledge of SQL, NoSQL databases, and data warehousing principles.
- Familiarity with distributed data processing and external table management.
- Insight into enterprise data solutions for PIM, CDP, MDM, and ERP applications.
- Exceptional problem-solving acumen and meticulous attention to detail.
Additional Qualifications:
- Acquaintance with data security and privacy standards.
- Experience in CI/CD pipelines and version control systems, notably Git.
- Familiarity with Agile methodologies and DevOps practices.
- Competence in technical writing for comprehensive documentation.
Role & Responsibilities
- Create innovative architectures based on business requirements.
- Design and develop cloud-based solutions for global enterprises.
- Coach and nurture engineering teams through feedback, design reviews, and best practice input.
- Lead cross-team projects, ensuring resolution of technical blockers.
- Collaborate with internal engineering teams, global technology firms, and the open-source community.
- Lead initiatives to learn and apply modern and advanced technologies.
- Oversee the launch of innovative products in high-volume production environments.
- Develop and maintain high-quality software applications using JS frameworks (React, NPM, Node.js, etc.).
- Utilize design patterns for backend technologies and ensure strong coding skills.
- Deploy and manage applications on AWS cloud services, including ECS (Fargate), Lambda, and load balancers. Work with Docker to containerize services.
- Implement and follow CI/CD practices using GitLab for automated build, test, and deployment processes.
- Collaborate with cross-functional teams to design technical solutions, ensuring adherence to Microservice Design patterns and Architecture.
- Apply expertise in Authentication & Authorization protocols (e.g., JWT, OAuth), including certificate handling, to ensure robust application security.
- Utilize databases such as Postgres, MySQL, Mongo and DynamoDB for efficient data storage and retrieval.
- Demonstrate familiarity with Big Data technologies, including but not limited to:
- Apache Kafka for distributed event streaming.
- Apache Spark for large-scale data processing.
- Containers for scalable and portable deployments.
Technical Skills:
- 7+ years of hands-on development experience with JS frameworks, specifically MERN.
- Strong coding skills in backend technologies using various design patterns.
- Strong UI development skills using React.
- Expert in containerization using Docker.
- Knowledge of cloud platforms, specifically OCI, and familiarity with serverless technology, services like ECS, Lambda, and load balancers.
- Proficiency in CI/CD practices using GitLab or Bamboo.
- Strong knowledge of Microservice Design patterns and Architecture.
- Expertise in authentication and authorization protocols such as JWT and OAuth, including certificate handling.
- Experience working with high stream media data.
- Experience working with databases such as Postgres, MySQL, and DynamoDB.
- Familiarity with Big Data technologies related to Kafka, PySpark, Apache Spark, Containers, etc.
- Experience with container Orchestration tools like Kubernetes.
Role: AUTOSAR Developer
Designation: Specialist, Technical Lead, Architect
Location: Hyderabad
Position: Full-time
Skills:
Proficient in development of Base Software (BSW) in AUTOSAR.
Experience in configuration of different AUTOSAR BSW components (Comms/DCM/DEM/Memory etc.)
Experience working with any of the commercial AUTOSAR stacks (EB/Vector/KPIT/Mentor/Arccore etc.)
Very good problem-resolution and debugging skills.
Very good communication and cross-cultural skills.
Experience working with the Vector AUTOSAR stack and related tool chain.
Experience working in RTE and APSW layers.
Working knowledge of ISO 26262.
Experience with the latest µC platforms (Infineon/Renesas/TI/NXP).
Job R&R:
Mentor/lead a team of around 2-3 engineers.
Work in SW development of ADAS products.
--
Regards
Sumani P
www.qcentrio.com