11+ HiveQL Jobs in Chennai | HiveQL Job openings in Chennai
Apply to 11+ HiveQL Jobs in Chennai on CutShort.io. Explore the latest HiveQL Job opportunities across top companies like Google, Amazon & Adobe.
Role Summary/Purpose:
We are looking for Developers/Senior Developers to be part of building an advanced analytical platform that leverages Big Data technologies and transforms legacy systems. This role offers an exciting, fast-paced, constantly changing and challenging work environment, and will play an important part in resolving and influencing high-level decisions.
Requirements:
- The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
- Overall minimum of 4 to 8 years of software development experience, including 2 years of Data Warehousing domain knowledge
- Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
- Excellent knowledge of SQL and Linux shell scripting
- Bachelor's/Master's/Engineering degree from a well-reputed university.
- Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
- Proven experience coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
- Ability to manage a diverse and challenging stakeholder community
- Diverse knowledge and experience working on Agile deliveries and in Scrum teams.
Responsibilities
- Should work as a senior developer or individual contributor depending on the situation
- Should take part in Scrum discussions and gather requirements
- Adhere to Scrum timelines and deliver accordingly
- Participate in a team environment for design, development, and implementation
- Should take up L3 activities on a need basis
- Prepare Unit/SIT/UAT test cases and log the results
- Coordinate SIT and UAT testing. Take feedback and provide necessary remediation/recommendations in time.
- Quality delivery and automation should be a top priority
- Coordinate changes and deployments in time
- Should foster healthy harmony within the team
- Owns interaction points with members of the core team (e.g., BA, testing, and business teams) and any other relevant stakeholders
Position Summary
The GCP Senior Data Engineer is responsible for designing, building, optimizing, and maintaining high‑quality data pipelines and cloud‑native data platforms on Google Cloud Platform (GCP). This role partners closely with data analysts, data scientists, business teams, DevOps, and platform engineering to deliver secure, reliable, and scalable data solutions that support Trane Technologies’ digital strategy.
This position requires strong hands‑on engineering capabilities, cloud experience, and a passion for delivering enterprise‑grade data solutions.
Qualifications
Education Requirements
• Bachelor’s /Master’s degree in Computer Science, Information Technology, Data Engineering, Software Engineering, or a related technical field.
Experience Requirements
• 7+ years of total experience in data engineering, data platforms, or data‑centric software engineering.
• 5+ years of hands‑on experience designing and building data pipelines, ETL/ELT workflows, and cloud‑native data solutions.
• 3+ years of direct experience with Google Cloud Platform (GCP) using services such as BigQuery, Cloud Storage, Cloud Run, Datafusion, Dataproc, Composer, and Pub/Sub.
• Advanced proficiency in SQL, Python, and PySpark for data processing and transformation.
• Proven experience developing and supporting production‑grade, end‑to‑end cloud data pipelines.
• Expertise in ELT/ETL design, performance optimization, and data transformation frameworks.
• Strong background in monitoring, logging, alerting, and data quality frameworks across distributed systems.
• Experience with DevOps tools and techniques, including CI/CD, GitHub, and related automation practices.
• Demonstrated experience working in global, cross‑functional, and agile development environments.
• Excellent communication, problem‑solving, and analytical skills with the ability to collaborate across technical and non‑technical teams.
• GCP certification preferred (e.g., Google Cloud Professional Data Engineer).
Core Responsibilities
1. Data Engineering & Cloud Development (Core Responsibilities)
• Design, build, and maintain scalable data processing systems on Google Cloud Platform (GCP).
• Develop end‑to‑end data pipelines to support ingestion, transformation, and storage.
• Build ELT/ETL workflows, database objects, and cloud‑native data orchestration solutions.
• Optimize pipelines for performance, scalability, cost efficiency, and reliability.
• Ensure robust monitoring, alerting, and data quality assurance across distributed systems.
• Implement secure data architectures following Trane Technologies’ standards.
2. Collaboration & Analytics Enablement
• Work closely with data analysts and data scientists to understand business requirements and translate them into scalable data solutions.
• Support analytical use cases, data modelling, and enable self‑service data consumption.
• Partner across global teams, including business stakeholders, product owners, and data governance.
3. DevOps / Platform Engineering Partnership
• Partner with DevOps and platform engineering teams to ensure data infrastructure is secure, reliable, and highly available.
• Apply working knowledge of CI/CD practices (code versioning, automation, testing frameworks) to support cloud‑based deployments.
• Collaborate on environments and pipelines, and promote code through dev → test → prod.
4. Engineering Excellence & Ways of Working
• Participate in code reviews and maintain global coding standards.
• Follow structured engineering methods, documentation practices, and release processes.
• Contribute effectively in distributed, agile development teams across multiple time zones.
• Communicate clearly with both technical and non‑technical stakeholders.
5. Innovation & Continuous Improvement
• Research, evaluate, and propose new tools, technologies, and design patterns.
• Stay current with cloud, data engineering, and analytics trends relevant to enterprise‑scale environments.
• Apply continuous improvement practices to enhance reliability, quality, and developer productivity.
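The pipeline work described above (ingest, transform, validate, and monitor data quality before loading) can be sketched in miniature. The example below is a hypothetical illustration only; the record shape, function name `clean_orders`, and validation rules are invented for this sketch, and a real pipeline would run a stage like this in PySpark or a BigQuery load job rather than plain Python.

```python
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    amount: float
    currency: str

def clean_orders(raw_rows):
    """Split raw dict rows into valid Order records and rejects.

    Rejected rows would typically be routed to a dead-letter table
    so data-quality alerts can fire when the reject rate spikes.
    """
    valid, rejects = [], []
    for row in raw_rows:
        try:
            order = Order(
                order_id=str(row["order_id"]),
                amount=float(row["amount"]),
                currency=str(row.get("currency", "USD")).upper(),
            )
            if order.amount < 0:
                raise ValueError("negative amount")
            valid.append(order)
        except (KeyError, ValueError, TypeError):
            rejects.append(row)
    return valid, rejects

raw = [
    {"order_id": 1, "amount": "19.99", "currency": "usd"},
    {"order_id": 2, "amount": "oops"},                 # unparseable amount
    {"order_id": 3, "amount": -5, "currency": "EUR"},  # fails validation
]
valid, rejects = clean_orders(raw)
print(len(valid), len(rejects))  # 1 valid row, 2 rejected
```

Separating the valid stream from the reject stream, rather than failing the whole batch, is what lets monitoring and alerting track data quality as a metric.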

A global provider of Business Process Management and Outsourcing solutions.
Appian Developer / Sr Appian Developer
· Extensive experience in Appian BPM application development
· Knowledge of Appian architecture and its object best practices
· Participate in analysis, design, and new development of Appian-based applications
· Provide team leadership and technical leadership to Scrum teams
· Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
· Build applications: interfaces, process flows, expressions, data types, sites, integrations, etc.
· Proficient with SQL queries and with accessing data present in DB tables and views
· Experience in analysis, designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services, and web services
· Experience working with different Appian object types, query rules, constant rules, and expression rules
Qualifications
· At least 6 years of experience implementing BPM solutions using Appian 19.x or higher
· Over 8 years implementing IT solutions using BPM or integration technologies
· Certification mandatory: L1 and L2
· Experience in Scrum/Agile methodologies with enterprise-level application development projects
· Good understanding of database concepts and strong working knowledge of any one of the major databases, e.g., Oracle, SQL Server, MySQL
Additional information
Skills Required
· Appian BPM application development on version 19.x or higher
· Experience with integrations using web services, e.g., XML, REST, WSDL, SOAP APIs, JDBC, JMS
· Good leadership skills and the ability to lead a team of software engineers technically
· Experience working in Agile Scrum teams
· Good Communication skills
IMP: Please read through before applying!
Nature of role: Full-time; On-site
Location: Thiruvanmiyur, Chennai
Responsibilities:
Build and manage automation workflows using n8n, Make (Integromat), Zapier, or custom APIs.
Integrate tools across JugaadX, WhatsApp, Shopify, Meta, Google Workspace, CRMs, and internal systems.
Develop and maintain scalable, modular automation systems with clear documentation.
Integrate and experiment with AI tools and APIs such as OpenAI, Gemini, Claude, HeyGen, Runway, etc.
Create intelligent workflows — from chatbots and lead scorers to content generators and auto-responders.
Manage cloud infrastructure (VPS, Docker, SSL, security) for automations and dashboards.
Identify repetitive tasks and convert them into reliable automated processes.
Build centralized dashboards and automated reports for teams and clients.
Stay up-to-date with the latest in AI, automation, and LLM technologies, and bring new ideas to life within Jugaad’s ecosystem.
Requirements:
Hands-on experience with n8n, Make, or Zapier (or similar tools).
Familiarity with OpenAI, Gemini, HuggingFace, ElevenLabs, HeyGen, and other AI platforms.
Working knowledge of JavaScript and basic Python for API scripting.
Strong understanding of REST APIs, webhooks, and authentication.
Experience with Docker, VPS (AWS/DigitalOcean), and server management.
Proficiency with Google Sheets, Airtable, JSON, and basic SQL.
Clear communication and documentation skills — able to explain technical systems simply.
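Since the requirements above call for a strong grasp of webhooks and authentication, here is a minimal sketch of one common pattern: verifying an HMAC signature on an incoming webhook payload. Many providers (Shopify and Meta among them) sign payloads this way; the header handling, secret, and function name here are invented for illustration, so check the specific provider's docs for the exact header and encoding.

```python
import hmac
import hashlib

def verify_webhook(payload: bytes, received_sig: str, secret: bytes) -> bool:
    """Recompute the HMAC-SHA256 of the payload and compare in constant time."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_sig)

secret = b"example-shared-secret"  # would come from the provider's dashboard
payload = b'{"event": "order.created", "id": 42}'
good_sig = hmac.new(secret, payload, hashlib.sha256).hexdigest()

print(verify_webhook(payload, good_sig, secret))        # True
print(verify_webhook(payload, "deadbeef" * 8, secret))  # False
```

Using `hmac.compare_digest` instead of `==` avoids leaking timing information about how much of the signature matched.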
Who You Are:
A self-starter who loves automation, optimization, and innovation.
Comfortable building end-to-end tech solutions independently.
Excited to collaborate across creative, marketing, and tech teams.
Always experimenting with new AI tools and smarter ways to work.
Obsessed with efficiency, scalability, and impact — you love saving time and getting more done with less.
What You Get:
A strategic and hands-on role at the intersection of AI, automation, and operations.
The chance to shape the tech backbone of Jugaad and influence how we work, scale, and innovate.
Freedom to experiment, build, and deploy your ideas fast.
A young, fast-moving team where your work directly drives impact and growth.
Looking for a passionate Production Support Lead and team player who wants to learn, contribute, and bring fun and energy to the team. We are a friendly startup that provides opportunities to explore and learn many things (new technologies, tools, etc.) while building quality products using best-in-class technology.
Responsibilities :
· Customer Relationship Management
· Incident Management: manage the ticket queue and resolve tickets in a timely manner.
· Analyzing the incidents and either responding to the end user with a solution or escalating it to the other IT teams.
· Troubleshoot minor and major system problems in a timely manner and escalate to L3 support when necessary.
· SLA Management
· Develop and maintain accurate technical and user documentation.
· Working with QA to ensure the quality and timing of new release deployments.
Skills/Experience :
· Strong analytical and problem-solving skills and interest in learning new things will be the key.
· Excellent interpersonal skills handling internal and external customers
· About 3 years of professional experience in providing product support in leading BFSI sector organisations
· Experience in any DB (SQL/NoSQL)
· Testing Exposure will be an added advantage
Skills – JBoss, DevOps, ServiceNow, Windows Server.
JD - Application Maintenance
Must have:
- Installation and configuration of custom/standard software, e.g., FileZilla, JDK, OpenJDK
- Installation and configuration of JBoss/Tomcat Server
- Configuration of HTTPS certificates in JBoss/Tomcat
- Windows Event Viewer, IIS logs, Windows Security, Active Directory
- Setting environment variables, registry values, etc.
Nice to have:
- Basics of monitoring
- Knowledge of PowerShell, MS Azure DevOps
- Deploying and configuring applications; checking the last installed version of any software/patch
- ServiceNow, ITIL, Incident Management, Change Management

A leading casino game software provider in India.
- Strong proficiency in MySQL database management
- Table design including normalization
- Database backups and recovery is a plus
- Experience with recent versions of MySQL
- Understanding of MySQL’s underlying storage engines, such as InnoDB and MyISAM
- Good experience with procedures and events in MySQL
- Experience with replication configuration in MySQL
- Proficient in writing and optimizing SQL statements
- Knowledge of MySQL features, such as its event scheduler
- Knowledge of limitations in MySQL and their workarounds in contrast to other popular relational databases
- Strong MySQL query-tuning skills
- Write shell scripts to automate manual administrative tasks
- Maintain backups and perform point-in-time restorations
- Creating tables, altering columns, and writing stored procedures
- Good experience creating indexes and optimizing queries
- Experience in MongoDB to SQL migration
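The indexing and query-optimization skills listed above follow one core loop: add an index on the filtered column, then confirm with `EXPLAIN` that the plan uses it. The sketch below illustrates that loop; it uses Python's standard-library `sqlite3` rather than MySQL purely so the example runs self-contained (in MySQL you would use `EXPLAIN SELECT ...` instead of `EXPLAIN QUERY PLAN`), and the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE players (id INTEGER PRIMARY KEY, name TEXT, score INTEGER)"
)
conn.executemany(
    "INSERT INTO players (name, score) VALUES (?, ?)",
    [(f"player{i}", i * 10) for i in range(1000)],
)

# Without an index this equality filter is a full table scan;
# the index turns it into a direct seek.
conn.execute("CREATE INDEX idx_players_score ON players (score)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT name FROM players WHERE score = 500"
).fetchall()
print(plan)  # the plan row should mention idx_players_score
```

The same verify-the-plan habit applies when tuning MySQL queries: an index only helps if the optimizer actually chooses it, which `EXPLAIN` makes visible.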
