
Similar jobs
Tech Lead
Location: On-site, Panaji, Goa
About Joyful
Joyful is a leading AI-powered stakeholder communication management platform for voice of stakeholder analysis and contact center solutions. Our mission is to use AI to make all interactions between a company and its stakeholders joyful by removing friction. Joyful is a part of Germinait Solutions Pvt. Ltd.
Our Joyful modules, Engage and Listen, help businesses understand and manage stakeholder interactions across digital channels. We enable companies to provide exceptional customer experiences while maximizing the productivity and efficiency of their support teams, all through one seamless platform.
At Joyful, we're committed to fostering meaningful interactions between stakeholders and brands by providing actionable insights, personalized replies, and a joyful experience for customers, users, and employees alike.
The Opportunity
We're seeking a talented Technical Lead who will play a pivotal role in designing, building, and scaling Joyful’s AI-powered products. In this role, you will lead technical architecture decisions, guide the engineering team, and ensure our platform remains robust, scalable, and delightful to use. You’ll work closely with product, design, and business teams to deliver impactful features while maintaining high code quality and performance.
In addition to technical leadership, you’ll champion Vibe Coding—our approach to writing clean, collaborative, and joyful code that engineers love to create and maintain. This means setting the tone for high-quality, maintainable code while fostering an environment where building software is an energizing, shared experience.
What You'll Do
- Technical Leadership & Architecture:
- Lead the design and implementation of scalable, high-performance, and secure software solutions for Joyful’s Engage and Listen platforms.
- Define technical roadmaps, architecture patterns, and coding best practices
- Ensure adherence to software development standards and conduct regular code reviews
- Make critical build-versus-buy and technology adoption decisions
- Team Management & Collaboration:
- Mentor and coach a team of engineers, fostering a culture of learning, ownership, and innovation
- Collaborate with cross-functional teams (product, UX, QA) to align technical solutions with business goals
- Drive agile development practices, including sprint planning, retrospectives, and backlog prioritization
- Identify and resolve bottlenecks in development, deployment, and delivery processes
- Vibe Coding Culture:
- Lead by example in practicing Vibe Coding—writing code that is clean, well-structured, and joyful to work with
- Encourage pair programming, open collaboration, and frequent peer reviews
- Maintain high coding standards while keeping the process creative and energizing for the team
- Promote a development culture where engineers feel motivated, supported, and proud of the work they ship
- Hands-On Development:
- Contribute directly to code when needed—particularly for complex modules, integrations, and performance optimization
- Oversee the development of APIs, microservices, and integrations with third-party platforms
- Ensure robust CI/CD pipelines, test automation, and monitoring systems are in place
- Innovation & Continuous Improvement:
- Stay ahead of emerging technologies in AI, cloud, and enterprise communication platforms
- Propose and implement innovative solutions to improve product performance, security, and maintainability
- Drive proof-of-concepts for new features or architectural improvements
- Ensure systems are designed for high availability, scalability, and disaster recovery
What You'll Need
- 6+ years of professional software development experience, with at least 2 years in a technical leadership role
- Proven expertise in Java (Spring Boot), REST APIs, and microservices architecture
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) and container orchestration (Docker, Kubernetes)
- Strong understanding of relational and NoSQL databases
- Experience building scalable, high-availability systems in B2B SaaS or AI-powered products
- Solid knowledge of software design patterns, performance optimization, and security best practices
- Familiarity with frontend technologies (Angular, React, or similar) is a plus
- Excellent communication and stakeholder management skills
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
- Passion for AI technologies and building software in a positive, high-energy coding environment
Why Join Joyful?
- Be at the forefront of the AI revolution in stakeholder management
- Lead impactful projects with a culture of Vibe Coding—where high-quality engineering meets great team energy
- Work with a highly skilled and passionate team
- Enjoy significant growth opportunities in a rapidly scaling organization
- A culture that values innovation, ownership, and collaborative problem-solving
- Work from our beautiful office in Goa, combining cutting-edge tech work with a high quality of life
At Joyful, we believe strong technical leadership is key to making every interaction seamless, intelligent, and joyful. If you’re excited about solving complex technical challenges while mentoring teams to deliver exceptional products, we’d love to meet you!
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python (a minimal streaming sketch follows this list).
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
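By way of illustration, here is a minimal PySpark Structured Streaming sketch of the kind of pipeline described above: it reads JSON order events from a Kafka topic and writes them to S3 as Parquet. The broker address, topic name, event schema, and bucket paths are placeholder assumptions, and the cluster is assumed to have the spark-sql-kafka connector package available.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical event schema -- replace with the real source schema.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

# Subscribe to a Kafka topic as a stream (broker and topic are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .option("startingOffsets", "latest")
       .load())

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Write micro-batches to S3 as Parquet, with checkpointing for recovery.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/orders/")
         .option("checkpointLocation", "s3a://example-bucket/_checkpoints/orders/")
         .outputMode("append")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```

The same transformation logic can be reused for batch backfills by swapping readStream/writeStream for read/write.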
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks (see the connector sketch after this list).
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
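To make the CDC requirement concrete, the snippet below registers a hypothetical Debezium MySQL connector with the Kafka Connect REST API. The hostnames, credentials, topic prefix, and table list are placeholders, and the config keys follow Debezium 2.x (older releases use, for example, database.server.name instead of topic.prefix).

```python
import json

import requests

# Hypothetical MySQL CDC connector definition -- adjust hosts, credentials,
# and the table list to the actual source database.
connector = {
    "name": "orders-mysql-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql.internal",
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "change-me",
        "database.server.id": "184054",
        "topic.prefix": "orders",
        "table.include.list": "shop.orders,shop.order_items",
        "schema.history.internal.kafka.bootstrap.servers": "broker:9092",
        "schema.history.internal.kafka.topic": "schema-changes.orders",
    },
}

# Register the connector with the Kafka Connect REST API (default port 8083).
resp = requests.post(
    "http://connect.internal:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

Once registered, each captured table lands on its own Kafka topic (for example orders.shop.orders), which downstream Spark or Kinesis consumers can read like any other stream.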
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lake house architectures.
- Knowledge of data warehousing concepts and query optimization.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions (see the DAG sketch after this list)
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
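And a minimal sketch of workflow orchestration with Airflow, assuming a hypothetical Glue job named curate-events in ap-south-1; the schedule argument requires Airflow 2.4+ (older versions use schedule_interval).

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_glue_job(**_):
    """Kick off the (hypothetical) Glue ETL job and return its run id."""
    glue = boto3.client("glue", region_name="ap-south-1")
    run = glue.start_job_run(JobName="curate-events")
    return run["JobRunId"]


# One task per day; extend with sensors or further tasks as the pipeline grows.
with DAG(
    dag_id="daily_curation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="start_glue_job",
        python_callable=start_glue_job,
    )
```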
Urgent Hiring!!!
We are looking for a 2D Animator / Motion Graphic Designer
Job Description:
- Motion Graphic Artist with experience developing our company's overall layout and production animation.
- Proficiency in Photoshop, After Effects, and Illustrator; take the design brief to record requirements and needs.
- Think creatively and develop innovative concepts and animations.
Openings: 3 positions (Females only)
Experience: 6 months to 2 years
Location: Surat (onsite & full-time only)
Job Description: Data Engineer
Role Overview
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis (a Glue job sketch follows this list).
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Work on data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
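A minimal AWS Glue job sketch of the kind of pipeline listed above, assuming a hypothetical raw_db database in the Glue Data Catalog (populated by a crawler) and an S3 bucket for the curated output:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: the job name is passed in as a job argument.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw events from a (hypothetical) catalog table created by a crawler.
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
)

# Example transform: drop records without an event type.
df = source.toDF().where("event_type IS NOT NULL")

# Write the curated output to S3 as partitioned Parquet.
(df.write.mode("append")
   .partitionBy("event_date")
   .parquet("s3://example-bucket/curated/events/"))

job.commit()
```

The same skeleton can be deployed through a CI/CD pipeline, with the database, table, and output path supplied as job parameters rather than hard-coded.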
Oracle Cloud HCM Technical
Experience- 6 – 12 years
Location- Hyderabad, Chennai, Bangalore, Pune, Kolkata, Noida, Gurgaon, Mumbai
1. Practitioner/expert-level knowledge in at least 2 of the following tools: HDL, HCM Extracts, BIP Reports, OTBI Reports, REST Web Services, Fast Formula
2. In-depth knowledge in at least two HCM Cloud modules:
- Core HR
- Absences
- Benefits
- Payroll
- Workforce Compensation
- OTL
- Taleo
- OLC
- Oracle Recruiting
3. Cloud certification will be an advantage
4. Very good communication skills and excellent problem-solving skills
5. Direct client interaction experience will be an advantage
Responsibilities
- Develop new user-facing features using React.js and build RESTful APIs using Node.js and MongoDB
- Build reusable code and libraries for future use
- Optimize applications for maximum speed and scalability
- Collaborate with team members, e.g., designers, product, and other stakeholders, to ensure product quality
- Ensure the technical feasibility of UI/UX designs
- Manage and maintain cloud infrastructure on AWS
Qualifications
- 3-6 years of experience with the MERN stack
- Proficiency with React.js, Node.js, MongoDB, and Express.js
- Familiarity with AWS services such as EC2, S3, and RDS, SQS, Lambda
- Understanding of RESTful API design principles
- Understanding of Agile software development methodologies
- Strong problem-solving and analytical skills

- CCTV cameras, fire alarms, motion sensors, video door phones, IP cameras, AC lifts, smart lights, switch controllers, and many more
- Biometric locks / motion sensors / video door phones / fire alarms / smart switches / motion lights / wardrobe sensors.
- Build our strategic affiliate network by recruiting new affiliate and influencer partners and actively managing relationships with our existing strategic partners, to provide quality business leads and cost-effective lead acquisition.
- Set up, track, report on, and manage affiliate program activity.
- Fully manage lifecycle marketing for affiliate and influencer partners.
- Generate weekly, monthly, and quarterly reports along with analysis.
- Ensure invoicing and payment schedules are current and on time, providing forecasting support for the business.
- Continually measure and optimize the overall program ROI
- Doing industry research and sourcing opportunities for overall business development
- Build a repository for effective sales pitches.
- Meet monthly, quarterly, annual revenue targets
Desired Profile of the candidate
- A self-starter who's comfortable working autonomously
- Experience in using affiliate tracking platforms
- A head for numbers, easily able to extract key insights from data
- Experience in creating pivot tables and using MS Excel overall
- Team player with a drive to help other people
- Ability to analyze data and give recommendations
- Work with multiple stakeholders for the flow of information and business process
- A passion for achieving targets