Pyramid Jobs in Hyderabad

An American bank holding company: a community-focused financial institution that provides accessible banking services to its members, operating on a not-for-profit basis.
Position: AI/ML Python Engineer
Location: Kothapet, Hyderabad (hybrid; 4 days a week onsite)
Contract-to-hire (full-time with the client).
- 5+ years of Python experience scripting ML workflows and deploying ML pipelines as real-time, batch, event-triggered, and edge deployments.
- 4+ years of experience using AWS SageMaker to deploy ML pipelines and models, including SageMaker Pipelines, SageMaker MLflow, SageMaker Feature Store, etc.
- 3+ years of experience developing APIs using FastAPI, Flask, or Django.
- 3+ years of experience with ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
- Solid understanding of the ML lifecycle: model development, training, validation, deployment, and monitoring.
- Solid understanding of CI/CD pipelines for ML workflows using Bitbucket, Jenkins, Nexus, and AutoSys for scheduling.
- Experience with ETL processes for ML pipelines using PySpark, Kafka, and AWS EMR Serverless.
- Good to have: experience with H2O.ai.
- Good to have: experience with containerization using Docker and orchestration using Kubernetes.
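To illustrate the kind of real-time SageMaker deployment work listed above, here is a minimal sketch that calls an already-deployed endpoint with boto3; the endpoint name and payload fields are hypothetical placeholders, not part of this posting.

```python
import json

import boto3

# Hypothetical endpoint name, used purely for illustration.
ENDPOINT_NAME = "churn-model-endpoint"

# The sagemaker-runtime client is the standard way to call a deployed
# real-time endpoint from application code.
runtime = boto3.client("sagemaker-runtime")


def predict(features: dict) -> dict:
    """Send one JSON record to the endpoint and return the parsed response."""
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps(features),
    )
    return json.loads(response["Body"].read())


if __name__ == "__main__":
    # Feature names are made up for the sketch.
    print(predict({"tenure_months": 12, "monthly_spend": 49.0}))
```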
Role Overview
Join our core tech team to build the intelligence layer of Clink's platform. You'll architect AI agents, design prompts, build ML models, and create systems powering personalized offers for thousands of restaurants. High-growth opportunity working directly with founders, owning critical features from day one.
Why Clink?
Clink revolutionizes restaurant loyalty using AI-powered offer generation and customer analytics:
- ML-driven customer behavior analysis (Pattern detection)
- Personalized offers via LLMs and custom AI agents
- ROI prediction and forecasting models
- Instagram marketing rewards integration
Tech Stack:
- Python
- FastAPI
- PostgreSQL
- Redis
- Docker
- LLMs
You Will Work On:
AI Agents: Design and optimize AI agents
ML Models: Build redemption prediction, customer segmentation, ROI forecasting
Data & Analytics: Analyze data, build behavior pattern pipelines, create product bundling matrices
System Design: Architect scalable async AI pipelines, design feedback loops, implement A/B testing
Experimentation: Test different LLM approaches, explore hybrid LLM+ML architectures, prototype new capabilities
Must-Have Skills
Technical: 0-2 years AI/ML experience (projects/internships count), strong Python, LLM API knowledge, ML fundamentals (supervised learning, clustering), Pandas/NumPy proficiency
Mindset: Extreme curiosity, logical problem-solving, builder mentality (side projects/hackathons), ownership mindset
Nice to Have: Pydantic, FastAPI, statistical forecasting, PostgreSQL/SQL, scikit-learn, food-tech/loyalty domain interest
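For a concrete flavor of the customer-segmentation work described in this role, here is a minimal sketch using Pandas and scikit-learn's KMeans; the column names, sample values, and cluster count are illustrative assumptions rather than Clink's actual data model.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy customer data; the columns are made up for illustration.
customers = pd.DataFrame({
    "visits_per_month": [1, 2, 8, 9, 15, 14, 3, 7],
    "avg_ticket_size": [12.0, 15.0, 30.0, 28.0, 55.0, 60.0, 14.0, 25.0],
})

# Scale features so both dimensions contribute comparably to the distance metric.
X = StandardScaler().fit_transform(customers)

# Assume three behavioral segments; in practice the count would be tuned
# (e.g., with an elbow plot or silhouette scores).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
customers["segment"] = kmeans.fit_predict(X)

print(customers)
```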
CTC: up to 20 LPA
Exp: 4 to 7 Years
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field
- 4+ years of experience in software development
- Strong proficiency in Java with deep understanding of web technology stack
- Hands-on experience developing applications with Spring Boot framework
- Solid understanding of Python programming language with practical Flask framework experience
- Working knowledge of NATS server for messaging and streaming data
- Experience deploying and managing applications in Kubernetes
- Understanding of microservices architecture and RESTful API design
- Familiarity with containerization technologies (Docker)
- Experience with version control systems (Git)
Skills & Competencies
- Java (Spring Boot, Spring Cloud, Spring Security)
- Python (Flask, SQLAlchemy, REST APIs)
- NATS messaging patterns (pub/sub, request/reply, queue groups)
- Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
- Web technologies (HTTP, REST, WebSocket, gRPC)
- Container orchestration and management
- Soft skills: problem-solving and analytical thinking
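As a rough sketch of the NATS messaging patterns mentioned in the skills list (pub/sub, request/reply, queue groups), the snippet below uses the nats-py client against an assumed local broker; the subject and queue names are placeholders.

```python
import asyncio

import nats  # nats-py client; assumes a NATS server is running locally


async def main():
    # Connection URL is an assumption for the sketch.
    nc = await nats.connect("nats://localhost:4222")

    async def handler(msg):
        # Request/reply: echo the payload back when a reply subject exists.
        if msg.reply:
            await msg.respond(b"ack: " + msg.data)

    # Queue group "workers" load-balances messages across subscribers.
    await nc.subscribe("orders.created", queue="workers", cb=handler)

    # Fire-and-forget publish, then a request/reply round trip.
    await nc.publish("orders.created", b"order-123")
    reply = await nc.request("orders.created", b"order-456", timeout=1.0)
    print(reply.data.decode())

    await nc.drain()


if __name__ == "__main__":
    asyncio.run(main())
```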
Job Summary:
Deqode is looking for a highly motivated and experienced Python + AWS Developer to join our growing technology team. This role demands hands-on experience in backend development, cloud infrastructure (AWS), containerization, automation, and client communication. The ideal candidate should be a self-starter with a strong technical foundation and a passion for delivering high-quality, scalable solutions in a client-facing environment.
Key Responsibilities:
- Design, develop, and deploy backend services and APIs using Python.
- Build and maintain scalable infrastructure on AWS (EC2, S3, Lambda, RDS, etc.).
- Automate deployments and infrastructure with Terraform and Jenkins/GitHub Actions.
- Implement containerized environments using Docker and manage orchestration via Kubernetes.
- Write automation and scripting solutions in Bash/Shell to streamline operations.
- Work with relational databases such as MySQL, including SQL query optimization.
- Collaborate directly with clients to understand requirements and provide technical solutions.
- Ensure system reliability, performance, and scalability across environments.
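The responsibilities above combine Python backend work with core AWS services; a minimal sketch of that combination, using boto3 to upload a file to S3 and synchronously invoke a Lambda, is shown below. The bucket and function names are placeholders.

```python
import json

import boto3

# Hypothetical resource names for illustration only.
BUCKET = "deqode-example-bucket"
FUNCTION_NAME = "process-report"

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")


def upload_and_process(local_path: str, key: str) -> dict:
    """Upload a file to S3, then invoke a Lambda synchronously to process it."""
    s3.upload_file(local_path, BUCKET, key)

    response = lambda_client.invoke(
        FunctionName=FUNCTION_NAME,
        InvocationType="RequestResponse",  # synchronous call
        Payload=json.dumps({"bucket": BUCKET, "key": key}).encode(),
    )
    return json.loads(response["Payload"].read())
```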
Required Skills:
- 3.5+ years of hands-on experience in Python development.
- Strong expertise in AWS services such as EC2, Lambda, S3, RDS, IAM, CloudWatch.
- Good understanding of Terraform or other Infrastructure as Code tools.
- Proficient with Docker and container orchestration using Kubernetes.
- Experience with CI/CD tools like Jenkins or GitHub Actions.
- Strong command of SQL/MySQL and scripting with Bash/Shell.
- Experience working with external clients or in client-facing roles.
Preferred Qualifications:
- AWS Certification (e.g., AWS Certified Developer or DevOps Engineer).
- Familiarity with Agile/Scrum methodologies.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
Job Title: Python Developer (FastAPI)
Experience Required: 4+ years
Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali
Shift: Night shift, 6:30 PM to 3:30 AM IST
About the Role
We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.
Required Skills & Qualifications
· 4+ years of professional experience in backend development with Python.
· Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
· Familiarity with asynchronous programming in Python.
· Working knowledge of version control systems (Git).
· Good problem-solving and debugging skills.
· Strong communication and collaboration abilities.
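For reference, a minimal FastAPI service of the kind this role calls for might look like the sketch below; the routes, model, and in-memory store are illustrative assumptions only.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

# In-memory store standing in for a real database in this sketch.
ORDERS: dict[int, dict] = {}


class OrderIn(BaseModel):
    item: str
    quantity: int = 1


@app.post("/orders/{order_id}")
async def create_order(order_id: int, order: OrderIn) -> dict:
    ORDERS[order_id] = order.model_dump()  # pydantic v2 API
    return {"order_id": order_id, **ORDERS[order_id]}


@app.get("/orders/{order_id}")
async def read_order(order_id: int) -> dict:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]

# Run with: uvicorn main:app --reload
```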
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
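As a small illustration of the extract step in such a pipeline, the sketch below queries BigQuery with the google-cloud-bigquery client; the project, dataset, and column names are placeholders, and application-default credentials are assumed.

```python
from google.cloud import bigquery

# Placeholder project; credentials come from the environment.
client = bigquery.Client(project="my-gcp-project")

QUERY = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my-gcp-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
"""


def extract_recent_order_counts() -> list[dict]:
    """Run the query and return rows as plain dicts (a simple 'extract' step)."""
    rows = client.query(QUERY).result()  # waits for the query job to finish
    return [dict(row) for row in rows]


if __name__ == "__main__":
    for record in extract_recent_order_counts()[:5]:
        print(record)
```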
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We build products at the intersection of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI; our notable products include JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.
Mon-Fri, in-office role with excellent perks and benefits!
Key Responsibilities:
1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.
2. Build and implement scalable and robust microservices and integrate API gateways.
3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).
4. Implement real-time data pipelines using Kafka.
5. Collaborate with front-end developers to ensure seamless integration of backend services.
6. Write clean, reusable, and efficient code following best practices, including design patterns.
7. Troubleshoot, debug, and enhance existing systems for improved performance.
Mandatory Skills:
1. Proficiency in at least one backend technology: Node.js, Python, or Java.
2. Strong experience in:
i. Microservices architecture
ii. API gateways
iii. NoSQL databases (e.g., MongoDB, DynamoDB)
iv. Kafka
v. Data structures (e.g., arrays, linked lists, trees)
3. Frameworks:
i. If Java: Spring Framework for backend development.
ii. If Python: FastAPI/Django frameworks for AI applications.
iii. If Node: Express.js for Node.js development.
Good to Have Skills:
1. Experience with Kubernetes for container orchestration.
2. Familiarity with in-memory databases like Redis or Memcached.
3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.
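To make the Kafka requirement concrete, here is a minimal produce/consume sketch using the kafka-python client; the broker address and topic name are assumptions for illustration.

```python
import json

from kafka import KafkaConsumer, KafkaProducer  # kafka-python client

BROKER = "localhost:9092"   # assumed local broker for the sketch
TOPIC = "user-events"       # hypothetical topic name


def produce_event(event: dict) -> None:
    """Serialize an event as JSON and publish it to the topic."""
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, value=event)
    producer.flush()


def consume_events() -> None:
    """Read events from the beginning of the topic and print them."""
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)


if __name__ == "__main__":
    produce_event({"user_id": 42, "action": "signup"})
```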
Role Overview:
The AI Research Intern will focus on natural language processing (NLP) and working with large language models (LLMs). They will assist in refining and testing the retrieval-augmented generation (RAG) system for CopilotGTM.
Key Responsibilities:
- Assist in developing and refining NLP models to answer customer queries.
- Research and implement improvements to minimize hallucinations in the LLM.
- Test RAG model configurations and provide feedback to improve accuracy.
- Work with the engineering team to improve real-time product recommendations and responses.
- Analyze datasets and fine-tune models for specific use cases (e.g., sales, compliance).
Skills Required:
- Strong understanding of NLP and familiarity with LLMs (GPT, BERT, etc.).
- Basic coding experience in Python.
- Knowledge of data handling, data processing, and model training.
- Problem-solving ability and eagerness to experiment with new techniques.
Preferred:
- Experience with libraries like Hugging Face, PyTorch, or TensorFlow.
- Familiarity with retrieval-augmented generation (RAG) systems.
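As a toy illustration of the retrieval half of a RAG system, the sketch below embeds a few documents with sentence-transformers and returns the closest matches for a query; the model choice and documents are illustrative assumptions.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# A small, commonly used embedding model; any sentence-embedding model would do.
model = SentenceTransformer("all-MiniLM-L6-v2")

DOCUMENTS = [
    "Our enterprise plan includes SSO and a dedicated account manager.",
    "Refunds are processed within 14 days of a cancellation request.",
    "The API rate limit is 1,000 requests per minute per workspace.",
]

# Pre-compute normalized document embeddings once.
doc_embeddings = model.encode(DOCUMENTS, normalize_embeddings=True)


def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    query_embedding = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_embeddings @ query_embedding  # cosine, since vectors are unit-norm
    top_idx = np.argsort(scores)[::-1][:k]
    return [DOCUMENTS[i] for i in top_idx]


# In a full RAG pipeline, the retrieved passages would be placed into the LLM
# prompt as grounding context, which is what helps reduce hallucinations.
print(retrieve("How long do refunds take?"))
```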
Job Description:
- Have intermediate/advanced knowledge of Python.
- Hands-on experience with OOP in Python; experience with the Flask/Django frameworks, ORM with MySQL, or MongoDB is a plus.
- Must have experience writing shell scripts and configuration files; should be proficient in Bash.
- Should have excellent Linux administration capabilities.
- Working experience with SCM; Git is preferred.
- Should have knowledge of the basics of networking in Linux, and computer networks in general.
- Experience with engineering practices such as code refactoring, design patterns, design-driven development, and Continuous Integration.
- Understanding of Architecture of OpenStack/Kubernetes and good knowledge of standard client interfaces is a plus.
- Code contributions to the OpenStack/Kubernetes community are a plus.
- Understanding of the NFV and SDN domains is a plus.
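Since the posting asks for OOP in Python together with Flask and ORM experience, here is a minimal Flask-SQLAlchemy sketch; it uses SQLite so it runs self-contained, with a MySQL connection string noted as the drop-in alternative. All names are illustrative.

```python
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# SQLite keeps the sketch self-contained; for MySQL this would be something like
# "mysql+pymysql://user:password@localhost/dbname" (placeholder credentials).
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///demo.db"
db = SQLAlchemy(app)


class Host(db.Model):
    """A Linux host tracked by the service; a plain ORM-mapped class."""
    id = db.Column(db.Integer, primary_key=True)
    hostname = db.Column(db.String(64), unique=True, nullable=False)
    ip_address = db.Column(db.String(45), nullable=False)


@app.route("/hosts")
def list_hosts():
    return jsonify([
        {"hostname": h.hostname, "ip": h.ip_address} for h in Host.query.all()
    ])


if __name__ == "__main__":
    with app.app_context():
        db.create_all()
    app.run(debug=True)
```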
Software Development Engineer – SDE 2.
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.
Top Skills
You write high-quality, maintainable, and robust code, often in Java, C++, or C#.
You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
You have experience building scalable software systems that are high-performance, highly available, highly transactional, low-latency, and massively distributed.
Roles & Responsibilities
You solve problems at their root, stepping back to understand the broader context.
You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
You recognize and use design patterns to solve business problems.
You understand how operating systems work, perform and scale.
You continually align your work with Amazon’s business objectives and seek to deliver business value.
You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
You communicate clearly with your team and with other groups and listen effectively.
Skills & Experience
Bachelor's or Master's degree in Computer Science or a relevant technical field.
Experience in software development and the full product life cycle.
Excellent programming skills in object-oriented programming languages, preferably Java, C/C++/C#, Perl, Python, or Ruby.
Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
Proficiency in SQL and data modeling.
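As a small illustrative example of the data-structure reasoning this role expects, the sketch below picks the worst 10 latencies from a large sample with a heap rather than a full sort; the data is simulated for the example.

```python
import heapq
import random

# 1,000,000 simulated latency samples (milliseconds).
latencies = [random.random() * 500 for _ in range(1_000_000)]

# Finding the 10 worst latencies with a heap is O(n log k),
# versus O(n log n) for sorting the whole list first.
worst_ten = heapq.nlargest(10, latencies)
print([round(ms, 1) for ms in worst_ten])
```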






