11+ NOC Jobs in Hyderabad | NOC Job openings in Hyderabad
Apply to 11+ NOC Jobs in Hyderabad on CutShort.io. Explore the latest NOC Job opportunities across top companies like Google, Amazon & Adobe.
We are seeking a high-caliber Firmware Lead to join our Engineering team at Gradera. In this role, you will be the technical anchor for the firmware squad, responsible for translating high-level architectural visions into robust, executable low-level designs (LLDs). You will lead the design and development of firmware solutions on NXP-based hardware platforms, ensuring seamless real-time data acquisition and integration with cloud-based Machine Learning (ML) platforms. We are looking for a seasoned expert who works independently, taking full ownership of the firmware lifecycle from hardware abstraction to cloud-edge synchronization.
Our Core Tech Stack
Embedded & OS
- NXP SoCs/MCUs: i.MX, LPC, and Kinetis series.
- Yocto Project: Custom layers, recipes, BitBake, and kernel configuration for Linux.
- RTOS Platforms: Deterministic performance, task scheduling, and interrupt handling.
Development & Integration
- Languages: Mandatory proficiency in C/C++ and C# (.NET on embedded targets/IoT).
- Communication: MQTT, WebSockets, CAN, UART, SPI, and I2C.
- Cloud & ML: Azure IoT Hub, AWS IoT Core, and data streaming via Kafka or Kinesis.
Infrastructure & Security
- Security: Secure boot, encryption, and device authentication.
- DevOps: Containerization (Docker) and CI/CD for firmware.
Key Responsibilities
- Architectural Ownership: Convert high-level blueprints into detailed technical designs for NXP-based systems, ensuring optimal performance across hardware and software layers.
- Autonomous Execution: Lead the end-to-end development of firmware modules, making critical technical decisions and resolving complex blockers without supervision.
- ML Pipeline Leadership: Collaborate with Data Engineering and ML teams to architect streaming and batch ingestion pipelines, ensuring data is correctly structured for ML training.
- Cloud-Edge Synchronization: Design secure and reliable transmission protocols for device-to-cloud communication, focusing on edge-to-cloud integration.
- Standards Enforcement: Act as the guardian of engineering excellence, implementing security best practices (secure boot, TLS) and ensuring high code quality.
- Technical Mentoring: Act as a technical beacon for the squad, conducting rigorous code reviews and mentoring senior engineers in Yocto Linux and RTOS concepts.
- Strategic Troubleshooting: Lead the debugging of critical firmware issues across hardware and software layers, including OTA update implementations.
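As a purely illustrative sketch of the secure device-to-cloud transmission described above (the device ID and key are hypothetical, and a real deployment would use hardware-backed credentials and TLS), a device might sign each telemetry payload so the cloud side can authenticate its origin:

```python
import hmac
import hashlib
import json
import time

DEVICE_ID = "nxp-imx8-001"          # hypothetical device identifier
DEVICE_KEY = b"per-device-secret"   # illustrative; provisioned securely in practice

def build_signed_payload(readings: dict) -> dict:
    """Wrap sensor readings with a timestamp and an HMAC-SHA256 tag
    so the cloud ingestion layer can authenticate the sending device."""
    body = {"device": DEVICE_ID, "ts": int(time.time()), "data": readings}
    canonical = json.dumps(body, sort_keys=True).encode()
    body["sig"] = hmac.new(DEVICE_KEY, canonical, hashlib.sha256).hexdigest()
    return body

def verify_payload(payload: dict) -> bool:
    """Cloud-side check: recompute the HMAC over the body without the tag."""
    sig = payload.pop("sig")
    canonical = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, canonical, hashlib.sha256).hexdigest()
    payload["sig"] = sig  # restore so the caller's dict is unchanged
    return hmac.compare_digest(sig, expected)
```

Any tampering with the readings in transit changes the recomputed tag, so verification fails; the same pattern underlies per-device authentication in MQTT-based pipelines.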
Preferred Qualifications
- 8 to 10 years of professional experience in embedded firmware development.
- Proven ability to work independently and lead technical squads in a fast-paced environment.
- Expert-level mastery of the Yocto Project and RTOS constraints.
- Deep proficiency in C/C++ and C# for embedded systems.
- Demonstrated track record of delivering low-level designs for edge-to-cloud ML systems.
Highly Desirable
- Industry Experience: Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is highly regarded.
- Experience with Edge AI / TinyML and industrial protocols (Modbus, OPC-UA).
- Knowledge of Cybersecurity standards for secure device provisioning.

Global Digital Transformation Solutions Provider
MUST-HAVES:
- LLM Integration & Prompt Engineering
- Context & Knowledge Base Design
- Experience running LLM evals
NOTICE PERIOD: Immediate – 30 Days
SKILLS: LLM, AI, PROMPT ENGINEERING
NICE TO HAVES:
- Data Literacy & Modelling Awareness
- Familiarity with Databricks, AWS, and ChatGPT Environments
ROLE PROFICIENCY:
Role Scope / Deliverables:
- Serve as the link between business intelligence, data engineering, and AI application teams, ensuring the Large Language Model (LLM) interacts effectively with the modeled dataset.
- Define and curate the context and knowledge base that enables GPT to provide accurate, relevant, and compliant business insights.
- Collaborate with Data Analysts and System SMEs to identify, structure, and tag data elements that feed the LLM environment.
- Design, test, and refine prompt strategies and context frameworks that align GPT outputs with business objectives.
- Conduct evaluation and performance testing (evals) to validate LLM responses for accuracy, completeness, and relevance.
- Partner with IT and governance stakeholders to ensure secure, ethical, and controlled AI behavior within enterprise boundaries.
KEY DELIVERABLES:
- LLM Interaction Design Framework: Documentation of how GPT connects to the modeled dataset, including context injection, prompt templates, and retrieval logic.
- Knowledge Base Configuration: Curated and structured domain knowledge to enable precise and useful GPT responses (e.g., commercial definitions, data context, business rules).
- Evaluation Scripts & Test Results: Defined eval sets, scoring criteria, and output analysis to measure GPT accuracy and quality over time.
- Prompt Library & Usage Guidelines: Standardized prompts and design patterns to ensure consistent business interactions and outcomes.
- AI Performance Dashboard / Reporting: Visualizations or reports summarizing GPT response quality, usage trends, and continuous improvement metrics.
- Governance & Compliance Documentation: Inputs to data security, bias prevention, and responsible AI practices in collaboration with IT and compliance teams.
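The "Evaluation Scripts & Test Results" deliverable above could be sketched, under simplified assumptions, as a keyword-coverage eval harness. This is not a production eval framework, and the scoring criterion (fraction of required keywords present) is a deliberately crude proxy for accuracy and completeness:

```python
from dataclasses import dataclass

@dataclass
class EvalCase:
    """One eval item: a prompt plus keywords the answer must mention."""
    prompt: str
    must_contain: list

def score_response(case: EvalCase, response: str) -> float:
    """Fraction of required keywords present in the model's response."""
    hits = sum(1 for kw in case.must_contain if kw.lower() in response.lower())
    return hits / len(case.must_contain)

def run_evals(cases, answer_fn):
    """Run every case through a model callable (hypothetical here) and
    return per-case scores plus the mean score."""
    scores = [score_response(c, answer_fn(c.prompt)) for c in cases]
    return scores, sum(scores) / len(scores)
```

In practice, `answer_fn` would call the deployed GPT endpoint, and the scoring would be richer (rubric grading, LLM-as-judge, human review), but the shape of the loop is the same.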
KEY SKILLS:
Technical & Analytical Skills:
- LLM Integration & Prompt Engineering – Understanding of how GPT models interact with structured and unstructured data to generate business-relevant insights.
- Context & Knowledge Base Design – Skilled in curating, structuring, and managing contextual data to optimize GPT accuracy and reliability.
- Evaluation & Testing Methods – Experience running LLM evals, defining scoring criteria, and assessing model quality across use cases.
- Data Literacy & Modeling Awareness – Familiar with relational and analytical data models to ensure alignment between data structures and AI responses.
- Familiarity with Databricks, AWS, and ChatGPT Environments – Capable of working in cloud-based analytics and AI environments for development, testing, and deployment.
- Scripting & Query Skills (e.g., SQL, Python) – Ability to extract, transform, and validate data for model training and evaluation workflows.
Business & Collaboration Skills:
- Cross-Functional Collaboration – Works effectively with business, data, and IT teams to align GPT capabilities with business objectives.
- Analytical Thinking & Problem Solving – Evaluates LLM outputs critically, identifies improvement opportunities, and translates findings into actionable refinements.
- Commercial Context Awareness – Understands how sales and marketing intelligence data should be represented and leveraged by GPT.
- Governance & Responsible AI Mindset – Applies enterprise AI standards for data security, privacy, and ethical use.
- Communication & Documentation – Clearly articulates AI logic, context structures, and testing results for both technical and non-technical audiences.
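As an illustrative take on the scripting and query skills listed above (the table and column names are invented for the example), a small data-quality check that gates data before it feeds an LLM context might look like:

```python
import sqlite3

def null_rate(conn: sqlite3.Connection, table: str, column: str) -> float:
    """Return the fraction of NULLs in a column: a basic validation pass
    before the data is exposed to the model."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls / total if total else 0.0
```

The same idea scales to warehouse SQL on Databricks or AWS; sqlite3 is used here only so the sketch is self-contained.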
Job Description for AEM (Autodesk)
Responsibilities :
Work with product owners to form technical solutions that meet requirements
Collaborate with architects, designers, product team, other developers and teams, as well as participate in code reviews
Help design and implement components that integrate with internal and external systems
Take ownership of, be a subject matter expert and maintain complex modules
Provide specific, actionable, and timely project updates in an Agile environment
Engineer mobile and web experiences with exceptional performance
Requirements :
5+ years industry experience with AEM (Adobe Experience Manager)
1+ years of experience with Contentful
Understanding and experience developing web services using AWS
Understanding and experience of server-side web development using Java
Experience developing efficient and reliable JavaScript (ES6/2015), both client- and server-side, and with JavaScript testing frameworks such as Jasmine, mocha.js, and Jest
Experience creating maintainable CSS
Experience with Object-Oriented and Functional programming paradigms
Understand fundamental elements of good software architecture
Working understanding of the latest web APIs and standards (HTML 5, CSS 3, ECMAScript 2015)
Experience with continuous integration tools, such as Jenkins
3 or more years of relevant programming experience
2 or more years experience with Adobe AEM (CQ) and related technologies
1 or more years of experience working in a modern web application stack (React, Backbone, Angular, Ember, etc.)
Bachelor's degree in Computer Science, Software Engineering, or equivalent work experience
Proficiency in spoken and written English
Location: Pune or Hyderabad.
Notice period: 15 to 30 days
- Creating and managing ETL/ELT pipelines based on requirements
- Build Power BI dashboards and manage the datasets needed.
- Work with stakeholders to identify data structures needed for future and perform any transformations including aggregations.
- Build data cubes for real-time visualisation needs and CXO dashboards.
Required Tech Skills
- Microsoft Power BI & DAX
- Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
- Azure Synapse, Azure Databricks, Azure HDInsight, Azure Data Factory
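A toy version of the transformation-and-aggregation work described above (the row shape is hypothetical) rolls raw order rows up to per-region revenue, the kind of pre-aggregated dataset a Power BI dashboard or data cube would consume:

```python
from collections import defaultdict

def aggregate_sales(rows):
    """Toy ETL transform step: sum order amounts per region.
    Each row is a (region, amount) pair, an invented shape for the sketch."""
    totals = defaultdict(float)
    for region, amount in rows:
        totals[region] += amount
    return dict(totals)
```

In a real pipeline this aggregation would live in Spark or Synapse rather than plain Python, but the transform logic is the same.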
Analyzes and audits inbound/outbound calls, emails, and customer surveys to identify areas of service delivery that did not meet pre-established performance standards within the CS support and Sales teams.
Provides structured and timely recommendations, with verbal and/or written feedback to the advisors.
- You would be responsible for sales of residential properties.
- You would be required to follow all standard operating procedures for effective sales.
- You would need to attend to all customer queries and ensure that all queries and complaints are forwarded to the right individual in time for them to address and resolve the issue at the earliest.
- You would have to collect & compile customer data on a timely basis.
- You will need to ensure that all reports are duly completed in time with efficiency.
- You will also need to make corporate visits & presentations.
- You will need to participate in the survey conducted by the sales department with regards to the market & competitors.
- You would have to make every effort to maximize both present and long-term sales & gross profits.
- You will need to call and advise our prospective customers on a daily basis. These calls would be made using a telephone, mobile handset, or automatic dialer system provided by the company. Through these calls, you will need to influence potential customers to visit our site. With proper follow-ups, you will need to ensure that these customers eventually buy their dream home from Pacifica.
1. Experience in JS Framework like Backbone
2. Good at Web Service integration with JSON
3. Project experience in MVC architecture, and Freemarker
4. Good at IntelliJ IDEA with Maven build automation
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 7+ years
Location : Pune / Hyderabad
Skills :
- Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
- Participate and contribute in Solution Design and Solution Architecture for implementing Big Data Projects on-premise and on cloud
- Technical hands-on experience in the design, coding, development, and management of large Hadoop implementations
- Proficient in SQL, Hive, Pig, Spark SQL, Shell Scripting, Kafka, Flume, and Sqoop on large Big Data and Data Warehousing projects, with a Java, Python, or Scala based Hadoop programming background
- Proficient with various development methodologies like waterfall, agile/scrum and iterative
- Good interpersonal skills and excellent communication skills for US and UK based clients
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com


