11+ Database performance tuning Jobs in Hyderabad
Apply to 11+ Database performance tuning Jobs in Hyderabad on CutShort.io. Explore the latest Database performance tuning Job opportunities across top companies like Google, Amazon & Adobe.
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with a minimum of 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
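Dremio specifics aside, the core mechanic behind the query performance tuning called for above (giving the planner an access path that avoids a full scan) can be sketched with Python's standard-library sqlite3. The table, data, and index names below are invented for illustration and are not Dremio syntax:

```python
import sqlite3

# In-memory database standing in for a much larger analytical store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_date TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("2024-01-01", "south", 10.0), ("2024-01-02", "north", 20.0)],
)

def plan(sql):
    """Return the query plan as a single string (detail column of each plan row)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM events WHERE event_date = '2024-01-01'"

before = plan(query)  # without an index: a full scan of events
conn.execute("CREATE INDEX idx_events_date ON events(event_date)")
after = plan(query)   # with the index: an index search on event_date

print(before)
print(after)
```

The same before/after discipline (inspect the plan, add the access structure, inspect again) carries over to reflections and partition pruning in engines like Dremio, Presto, or Trino.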
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Position Overview:
As a Shopify Developer, you will play a pivotal role in the development and customization of Shopify themes. Your expertise in HTML5, SASS/CSS3, and JavaScript, coupled with proficiency in VueJS or React, will drive the creation of dynamic and responsive web designs. Operating within EST and GMT work times, you will utilize CI/CD frameworks and pipelines to streamline development processes, emphasizing performance optimization techniques such as FCP, CLS, and LCP. Your familiarity with Shopify site architecture and integration with third-party apps like Algolia and Boost will enable you to implement custom solutions tailored to clients' needs.
Additionally, your strong grasp of Liquid code and Shopify JS APIs will facilitate troubleshooting and resolution of front-end technical challenges, ensuring cross-browser compatibility and adherence to performance budgets. Your ability to work both independently and collaboratively, coupled with a keen attention to detail and problem-solving skills, will be invaluable in driving the success of our projects. While certification and experience with Shopify Plus and API development are advantageous, your dedication to understanding technical architecture and delivering end-to-end project implementations will be key to your success in this role.
Key Responsibilities:
• Expert in Shopify theme development and customization
• Expertise in HTML5, SASS/CSS3, and JavaScript, plus expertise with VueJS or React
• Willingness to work EST and GMT hours
• Experience with CI/CD frameworks and pipelines
• Experience with performance optimization and performance budgets, including FCP, CLS, and LCP
• Experience with Shopify site architecture and its custom integration with third-party apps such as Algolia and Boost
• Working knowledge of code optimization and improving site performance in Shopify themes
• Ability to work with anything that exposes a JSON REST or GraphQL API
• Proficiency with the various Shopify JS APIs; strong in Liquid code
• Expertise in troubleshooting and resolving front-end technical problems, guiding the team through complex technical challenges
• A thorough understanding of the Shopify admin/backend system and cross-browser compatibility issues
• A strong understanding of responsive web design techniques
• Ability to perform unit testing
• Excellent problem-solving skills and attention to detail
• Ability to work independently and in a team environment
Required Qualifications:
• Shopify certification
• Experience with Shopify Plus store development.
• Strong API development and customization skills; app development using PHP/Laravel or another scripting language
• Hands-on experience integrating APIs/web services
• End-to-end Shopify project implementation
• Ability to understand the technical architecture design of the solution
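Most of the Shopify work above lives in Liquid and JavaScript, but the shape of the JSON REST/GraphQL API requirement is language-agnostic: a GraphQL call is just an HTTP POST whose JSON body carries a query string and variables. A minimal Python sketch follows; the query text is illustrative, not a verified excerpt of Shopify's schema:

```python
import json

# The GraphQL request body: query text plus a variables map.
# The field names here are invented for illustration.
query = """
query ProductTitles($first: Int!) {
  products(first: $first) {
    edges { node { id title } }
  }
}
"""

# This string is what an HTTP client would POST to the GraphQL endpoint.
payload = json.dumps({"query": query, "variables": {"first": 5}})

# The round trip back out: a client only needs json.loads to unpack it.
decoded = json.loads(payload)
print(decoded["variables"]["first"])
```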
Data Quality Engineer
Engineering - Hyderabad, Telangana
About Gradera — Digital Twin & Physical AI Platform
At Gradera, we are building a next-generation Digital Twin and Physical AI platform that enables enterprises to model, simulate, and optimize complex real-world systems. Our work brings together strategy, architecture, data, simulation, and experience design to power decision-making across large-scale operational environments such as manufacturing, logistics, and supply chain networks.
This platform-led initiative applies AI-native execution, advanced simulation, and governed orchestration to help organizations test scenarios, predict outcomes, and continuously improve performance. We operate with an enterprise-first mindset prioritizing reliability, transparency, and measurable business impact as we build intelligent systems that scale beyond a single industry or use case.
Overview
We are seeking a detail-oriented Data Quality Engineer to ensure the integrity, accuracy, and reliability of data powering our digital twin and AI platforms. You will design and implement data quality frameworks, build automated validation pipelines, and establish quality metrics that enable trusted, simulation-ready data products. This role is critical to ensuring that operational decisions and ML models are built on a foundation of high-quality, governed data.
Our core data quality stack includes:
Data Quality Frameworks
- Delta Live Tables expectations for declarative quality enforcement
- Great Expectations for comprehensive data validation
- Databricks data profiling and quality monitoring
Platform & Tools
- Databricks SQL and PySpark for quality checks at scale
- Unity Catalog for lineage tracking and governance compliance
- Python for custom validation logic and anomaly detection
Observability
- Quality metrics dashboards and alerting
- Data profiling and statistical analysis
- Anomaly detection and drift monitoring
Key Responsibilities
- Design and implement data quality frameworks using Delta Live Tables expectations and Great Expectations
- Build automated data validation pipelines that enforce quality standards at ingestion and transformation stages
- Develop data profiling processes to understand data distributions, patterns, and anomalies
- Define and track data quality metrics (completeness, accuracy, consistency, timeliness, validity)
- Implement anomaly detection mechanisms to identify data drift and quality degradation
- Create quality dashboards and alerting systems for proactive issue identification
- Collaborate with data engineers to embed quality checks into ETL/ELT pipelines
- Partner with data architects to establish data quality standards and governance policies
- Investigate and perform root cause analysis for data quality issues
- Document data quality rules, thresholds, and remediation procedures
- Support data certification processes for simulation-ready and ML-ready datasets
- Drive continuous improvement in data quality practices and tooling
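Frameworks like Great Expectations express these checks declaratively, but the completeness and validity metrics named above reduce to simple ratios. A minimal plain-Python sketch, with field names and bounds invented for illustration:

```python
from datetime import date

# Toy records standing in for a much larger ingested batch.
rows = [
    {"sensor_id": "A1", "reading": 21.5, "ts": date(2024, 1, 1)},
    {"sensor_id": "A2", "reading": None, "ts": date(2024, 1, 1)},
    {"sensor_id": None, "reading": 19.0, "ts": date(2024, 1, 2)},
]

def completeness(rows, field):
    """Fraction of rows where `field` is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, low, high):
    """Fraction of populated values falling inside [low, high]."""
    vals = [r[field] for r in rows if r[field] is not None]
    return sum(low <= v <= high for v in vals) / len(vals)

metrics = {
    "reading_completeness": completeness(rows, "reading"),
    "sensor_id_completeness": completeness(rows, "sensor_id"),
    "reading_validity": validity(rows, "reading", -40.0, 60.0),
}
print(metrics)
```

In a real pipeline these ratios would be compared against thresholds and fed into the dashboards and alerting described above.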
Preferred Qualifications
- 6+ years of experience in data engineering or data quality roles, with 3+ years focused on data quality
- Track record of implementing enterprise-scale data quality frameworks
- Experience with Lakehouse architectures (Delta Lake, Iceberg)
- Familiarity with real-time data quality monitoring for streaming pipelines
- Experience working in agile, cross-functional teams
Highly Desirable
- Experience with data quality for digital twin or simulation platforms
- Familiarity with operational state data validation and temporal consistency checks
- Experience with graph data quality validation (Neo4j or similar)
- Exposure to ML data quality (feature validation, training data quality)
- Experience with data observability platforms
- Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is a plus
Location: Hyderabad, Telangana
Department: Engineering
Employment Type: Full-Time
Key Responsibilities:
- Design, develop, and execute automated test scripts for trading applications.
- Work with product owners and business analysts to understand requirements and write acceptance test cases.
- Collaborate with developers, product managers, and other stakeholders to understand requirements and create test plans.
- Perform regression, performance, and end-to-end testing to ensure software reliability.
- Identify, document, and track defects using appropriate tools and methodologies.
- Maintain and enhance existing test automation frameworks for both frontend and backend.
- Report coverage, functionality, defect-aging, and closure metrics to stakeholders so they can gauge the stability of releases.
- Integrate automation cases into CI/CD pipelines.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3+ years of proven experience in automation testing for web and backend applications.
- Strong knowledge of testing frameworks (e.g., Selenium, Cypress, JUnit, TestNG, Playwright).
- Experience with API testing tools (e.g., Postman, SoapUI, RestAssured).
- Familiarity with programming languages such as Java, Python, or JavaScript.
- Ability to write basic SQL queries to validate data in databases.
- Understanding of CI/CD processes and tools (e.g., Jenkins, GitLab CI).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Prior experience with trading applications or core financial services related applications is a big plus
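The SQL-validation requirement above is easy to make concrete: automated checks hunt for rows that violate a rule and pass when none are found. A sketch using standard-library sqlite3; the orders schema and status values are invented, not taken from any real trading system:

```python
import sqlite3

# Minimal stand-in for a trading application's database.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER, status TEXT)"
)
db.executemany(
    "INSERT INTO orders (symbol, qty, status) VALUES (?, ?, ?)",
    [("INFY", 100, "FILLED"), ("TCS", 50, "OPEN"), ("INFY", 0, "FILLED")],
)

def check(sql):
    """A validation passes when the query hunting for bad rows finds zero of them."""
    return db.execute(sql).fetchone()[0] == 0

results = {
    # No order should have a non-positive quantity.
    "positive_qty": check("SELECT COUNT(*) FROM orders WHERE qty <= 0"),
    # Every status must come from the known set.
    "valid_status": check(
        "SELECT COUNT(*) FROM orders WHERE status NOT IN ('OPEN', 'FILLED', 'CANCELLED')"
    ),
}
print(results)
```

Here the zero-quantity order fails the first check, which is exactly the kind of defect such queries are written to surface before release.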
Responsibilities:
• Develop and deploy SailPoint IIQ/IDN implementation(s).
• Contribute to requirements gathering, technical design & test case preparation activities.
• Develop code and perform unit testing in line with the requirements and design, for client-specific use cases.
• Perform integration testing, debugging, and troubleshooting of issues, including interactions with the technology vendor when needed.
• Assist client(s) with execution of user acceptance testing.
• Support client(s) with readiness for deployment and actual deployment, followed by hypercare post-production.
• Enable knowledge transfer and handover to the client.
• Adhere to and implement security best practices throughout the lifecycle of an engagement.
Educational Qualification:
• Bachelor's or Master's degree in a related field (CS/IT) or equivalent work experience.
Job Qualifications:
• Good understanding of Identity & Access Management concepts.
• 3 to 5 years of relevant experience in implementing Identity and Access Management solutions using SailPoint IdentityIQ or IdentityNow is a must.
• IdentityIQ Engineer, IdentityIQ Architect, or IdentityNow Engineer certification(s) are good to have.
• Good understanding of Object-Oriented Programming concepts.
• Implementation experience with SailPoint IIQ 7.2 and above versions or IdentityNow on features:
- On-boarding of new applications using native connectors and custom API
- Rules - Connector, Aggregation & Provisioning Rules
- User Lifecycle requests and Lifecycle events
- Custom Tasks and Reports
- Roles, SoD Policies & Certifications
• Good Java programming skills; Java certification is an added advantage.
• Strong analytical skills and excellent verbal and written communication skills.
- To facilitate the efficient administration of bids, bid reviews and associated bid documentation.
- Manages the entire tendering process from preparation to delivery, ensuring timely and accurate completion of tender proposals and submissions, complying with regulations, and maintaining records and databases to improve future submissions.
- Maintains contractual records and documentation, such as receipt and control of all contract correspondence, etc.
- Develops marketing strategies to promote and charter-hire vessels; conducts market research, analyzes trends, and provides data to senior managers for the development of strategies.
- Content Creation and Management which includes developing and managing marketing materials, including brochures, website content, and social media content.
- Identification of vessels from domestic & foreign markets.
- Interaction with Company clients & customers.
- Market Research and Analysis, which includes conducting market research to identify target audiences, analyze trends, and evaluate competitor activities.
- Data Analysis and Reporting, which means monitoring the performance of marketing campaigns, analyzing data, and generating reports to track progress and measure ROI (Return on Investment) for vessel RFQs.
- Relationship Management: Build and maintain relationships with clients, partners, and industry stakeholders.
- Industry Knowledge: Stay up to date on industry trends, regulations, and best practices in vessel marketing, and on clients' evolving technical requirements.
- Coordination with internal departments.
- Ability to take on cross-functional duties.
In short, the candidate must be suitable for the following:
- Responsible for Marketing and Chartering of Offshore Supply Vessels to local & overseas clients.
- Develop executable business development plans for assigned region.
- Able to engage global industry contacts and cultivate strong relations with clients / partners.
- Provide weekly marketing reports on business negotiations, proposals, and tenders handled by the individual and the team.
- Solicit and collate market intelligence for development of business strategy.
- Coordinate and work with internal cross-functional departments & companies to support the execution and delivery of projects.
- Ensure timely payment collection on projects.
- Coordinate and ensure timely delivery of vessel to client.
Job Description:
Responsibilities
· Take end-to-end responsibility for the Azure landscape of our customers
· Manage code releases and operational tasks within a global team, with a focus on automation, maintainability, security, and customer satisfaction
· Use the CI/CD framework to rapidly support lifecycle management of the platform
· Act as L2-L3 support for incidents, problems, and service requests
· Work with various Atos and third-party teams to resolve incidents and implement changes
· Implement and drive automation and self-healing solutions to reduce toil
· Improve error budgets through hands-on design and development of solutions that address reliability issues and/or risks
· Support ITSM processes and collaborate with service management representatives
Job Requirements
· Azure Associate certification or equivalent knowledge level
· 5+ years of professional experience
· Experience with Terraform and/or native Azure automation
· Knowledge of CI/CD concepts and tooling (e.g., Jenkins, Azure DevOps, Git)
· Must be adaptable to work in a varied, fast-paced, ever-changing environment
· Good analytical and problem-solving skills to resolve technical issues
· Understanding of Agile development and Scrum concepts is a plus
· Experience with Kubernetes architecture and tools a plus
Candidates should have at least 4 years of experience in Python automation testing with the Robot Framework and Selenium; candidates with less experience cannot be considered.
Responsibilities:
- Setup and integrate various APIs
- Dealing with scripts written in different languages (PHP, Node.js, Python, PhantomJS, Ruby)
- Must be an R&D specialist
- Keep up to date with advanced concepts
- Adaptable to learning new technologies
- Knowledge of Linux and its commands
- Fast at debugging code
- Should be a quick learner
- Good communication skills
Requirements
- Minimum 3 years of experience in PHP7
- 1 year of experience in Node.js/Python
- Good work experience in PHP7
- Strong knowledge of OOP concepts
- Proven knowledge of REST and SOAP API integrations
- Experience with Composer installations
- Strong grasp of JavaScript and Node.js concepts
- Good knowledge of MongoDB and MySQL
- Knowledge of versioning tools (SVN/Git)
- Knowledge of Amazon AWS and Google Cloud Platform is an added advantage
- Knowledge of Elasticsearch is an added advantage
- Good to have knowledge of pthreads
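The closing pthreads item in the list above is about shared-state concurrency. The same pattern, a lock guarding the critical section, can be sketched with Python's threading module (an illustration, not PHP or C pthreads code):

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    """Increment the shared counter n times, taking the lock for each increment."""
    global counter
    for _ in range(n):
        with lock:  # without this, concurrent increments can be lost
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: the lock makes every increment take effect
```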
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
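A common first pass when leading a data investigation, as described above, is flagging statistical outliers in pipeline metrics. A minimal standard-library sketch; the counts and the 2-sigma threshold are illustrative choices, not a prescribed method:

```python
import statistics

# Daily row counts from a hypothetical pipeline; the spike is the issue to find.
counts = [100, 102, 98, 101, 99, 103, 97, 100, 250]

mean = statistics.mean(counts)
stdev = statistics.stdev(counts)

# Flag values more than 2 sample standard deviations from the mean.
anomalies = [x for x in counts if abs(x - mean) / stdev > 2]
print(anomalies)
```

In practice the threshold and the statistic (z-score, IQR, seasonal baseline) would be tuned to the data source before alerting on it.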
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud

Knowledge Hut Solutions Pvt. Ltd., an edu-tech (product) company
Job Description
We are looking for an experienced and talented UI designer to design and shape unique, user-centric products and experiences. You will be able to make deliberate design decisions and translate any given user-experience journey into a smooth and intuitive interaction. The ideal candidate should have experience working in agile teams with developers and UX designers.




