

Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, and have strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
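The "SQL for analytics and windowing functions" item above can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3 module. The table and data are hypothetical, and SQLite 3.25 or later is assumed for window-function support:

```python
import sqlite3

# In-memory database with a hypothetical sales table (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100), ("east", 200), ("west", 50), ("west", 300)],
)

# Window function: rank each sale within its region by amount.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)
# [('east', 200, 1), ('east', 100, 2), ('west', 300, 1), ('west', 50, 2)]
```

The same `PARTITION BY ... ORDER BY` pattern carries over to the analytics engines named in this role (BigQuery, Snowflake, Spark SQL).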
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
Design, build, and maintain scalable data pipelines using Databricks and PySpark.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
Ensure data quality, performance, and reliability across data workflows.
Participate in code reviews, data architecture discussions, and performance optimization initiatives.
Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
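As a rough illustration of the ETL/ELT pattern named in the skills above (ingest, transform, load), here is a minimal sketch in plain Python with no Spark or Databricks dependency; all function names, field names, and sample data are hypothetical:

```python
# A minimal, illustrative ETL sketch: extract -> transform -> load.

def extract(raw_rows):
    """Ingest: parse raw "id,value" strings into records."""
    for line in raw_rows:
        rec_id, value = line.split(",")
        yield {"id": int(rec_id), "value": float(value)}

def transform(records):
    """Wrangle: drop negative values and add a derived column."""
    for rec in records:
        if rec["value"] >= 0:
            rec["value_doubled"] = rec["value"] * 2
            yield rec

def load(records):
    """Load: materialise into a list (a stand-in for a warehouse write)."""
    return list(records)

raw = ["1,10.0", "2,-3.5", "3,4.0"]
result = load(transform(extract(raw)))
print(result)
# [{'id': 1, 'value': 10.0, 'value_doubled': 20.0},
#  {'id': 3, 'value': 4.0, 'value_doubled': 8.0}]
```

In a Databricks/PySpark pipeline the same three stages would typically map to a source read, DataFrame transformations, and a Delta or warehouse write.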
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
Notice period - Immediate to 15 days
Location: Bangalore
Requirements
We are seeking an experienced OIC Lead to drive and manage B2B technical integrations and ensure seamless connectivity across enterprise applications. The role requires strong expertise in Oracle Integration Cloud (OIC), SOA, and related integration platforms, with the ability to collaborate effectively with stakeholders and lead integration initiatives.
Key Responsibilities
- Lead the design, development, testing, and support of enterprise application integrations using OIC, SOA, SOACS, and MFT.
- Oversee end-to-end B2B integration processes, including EDI and cloud-based integrations.
- Collaborate with cross-functional teams, including business stakeholders, technical teams, and 3rd party vendors, to deliver integration solutions.
- Provide technical leadership and best practices for ERP integration across platforms such as Oracle Cloud ERP and SAP.
- Apply domain expertise in business processes such as Procure-to-Pay, Order-to-Cash, logistics, procurement, and supply chain.
- Identify opportunities for process improvement, implement automation, and enhance integration efficiency.
Qualifications
- Minimum 5+ years of professional experience, with at least 3+ years in enterprise-scale integration environments.
- Strong proficiency in integration platforms: Oracle SOA, SOACS, OIC, and MFT.
- Solid knowledge of communication protocols (FTP, SFTP, AS2) and data formats (XML, XML schemas).
- Experience in SDLC methodologies and project management within integration projects.
- Strong problem-solving, analytical, and troubleshooting skills.
- Familiarity with cloud integration services, API management platforms, and hybrid integration models is highly desirable.
- Proven ability to lead integration projects and mentor team members.
Role Overview:
We are seeking a Senior Consultant with deep expertise in storage and backup infrastructure, data migration strategies, and enterprise IT consulting. This individual will lead customer engagements, build methodologies, and deliver value through the ZENfra platform.
Experience in product implementation and customer support management is a strong plus, enabling this role to bridge consulting with hands-on delivery and post-deployment success.
Key Responsibilities:
Consulting & Strategy:
Engage with enterprise customers to understand their infrastructure, pain points, and transformation goals.
Design and recommend migration strategies and modernization roadmaps for storage and backup environments.
Develop reusable methodologies and frameworks tailored to customer needs.
ZENfra Enablement:
Translate customer requirements into actionable insights using the ZENfra platform.
Build and deliver ZENpacks (customized solution bundles) including reports, dashboards, and automation scripts.
Collaborate with ZENfra product and engineering teams to enhance platform capabilities based on field feedback.
Product Implementation & Support:
Lead or support ZENfra platform deployments across customer environments.
Ensure smooth onboarding, configuration, and integration of ZENfra components.
Manage customer support escalations, ensuring timely resolution and customer satisfaction.
Provide feedback to product teams to improve usability and performance.
Leadership & Delivery:
Lead and mentor a team of consultants and engineers across multiple customer engagements.
Prioritize tasks and manage delivery timelines while maintaining high customer satisfaction.
Ensure quality and consistency in deliverables across projects.
Customer Success:
Act as a trusted advisor to customers, helping them derive maximum value from VTG and ZENfra offerings.
Conduct workshops, training sessions, and executive briefings as needed.
Required Skills & Experience:
- 10+ years of experience in IT consulting with a focus on storage, backup, and data protection technologies.
- Proven experience in data centre migrations, cloud transitions, and infrastructure assessments.
- Strong leadership experience managing cross-functional teams and customer engagements.
- Hands-on experience with product implementation and customer support management.
- Excellent communication and stakeholder management skills.
- Familiarity with tools like ZENfra, ZENfra Ignite, or similar platforms is a strong plus.
- Ability to create structured reports, methodologies, and reusable assets.
- Experience working with global customers and managing multiple priorities.
Preferred Qualifications:
Certifications in storage technologies (e.g., NetApp, Dell EMC, Veritas, Commvault).
Exposure to cloud platforms (AWS, Azure, GCP) and hybrid infrastructure.
Experience with automation and scripting (Python, PowerShell) is a plus.
Job Title - Full Stack Trainer
Skills Required - Java, Python, HTML, CSS, Node.js, SQL, React, Git, SVN
Package - 6-10 LPA
Experience - 3-4 Years
Location - Coimbatore
We are looking for a Full Stack Engineer for a client of ours that is headquartered in the USA and operates in the health tech space, offering an innovative solution for dominant health issues. The company is well funded and backed by investors like Sequoia Capital.
They are looking for people who are interested to join their company and help them make an impact on their product development and growth.
In this role you will be part of the Product Engineering Team and will be responsible for working with cross-functional teams.
Experience Required
5+ years of experience across both frontend and backend development - hands-on experience with the front end, middle tier, and back end.
3+ years of experience building RESTful web services using Node.js
3+ years of experience writing batch/cron jobs using Python and shell scripting
3+ years of experience in web application development using JavaScript and ReactJS
Understanding of JavaScript, TypeScript, HTML, CSS, JSON, and REST-based applications
Experience with database technologies - MongoDB, MySQL, Redis, Elasticsearch
Experience building applications deployed on the cloud using AWS, Docker, and Kubernetes
Good to have: understanding of code versioning tools like Git
Experience with or understanding of JS-based build tools like Grunt, Gulp, Bower, Webpack, and NPM
Good communication, willingness to work in a diverse team, and comfort in a fast-paced environment are required to be successful in this role.
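As a toy illustration of the kind of batch/cron job mentioned in the experience list above, here is a minimal Python sketch that summarises newline-delimited JSON events; the event format and field names are hypothetical:

```python
import json
from collections import Counter

def summarise(ndjson_lines):
    """Count events per type from newline-delimited JSON records."""
    counts = Counter()
    for line in ndjson_lines:
        event = json.loads(line)
        counts[event["type"]] += 1
    return dict(counts)

events = ['{"type": "login"}', '{"type": "login"}', '{"type": "purchase"}']
print(summarise(events))
# {'login': 2, 'purchase': 1}
```

In practice a job like this would read from a log file or queue and be scheduled via cron or a workflow orchestrator rather than run on an in-memory list.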
Education Experience
A degree in Computer Science is preferred, or equivalent professional experience in product companies.
FURIOUS FOX is looking for Embedded Developers with strong coding skills in C & C++ as well as experience with Embedded Linux.
Experience : (Minimum 7-10 yrs)
• Experienced in edge processing for connected building / industrial / consumer appliances / automotive ECUs
• Have a good understanding of IoT platforms and architecture
• Deep experience in operating systems (e.g. Linux, FreeRTOS): kernel development, device drivers, sensor drivers
• Have experience with various low-level communication protocols, memory devices, messaging frameworks, etc.
• Have a deep understanding of design principles, design patterns, container preparations
• Have developed hardware and OS abstraction layers and sensor handler services to manage various BSPs and OS standards
• Have experience with Python edge packages.
• Have a good understanding of IoT databases for edge computing
• Good understanding of connectivity application protocols and connectivity SDK for Wi-Fi and BT / BLE
• Experienced in arm architecture, peripheral devices and hardware board configurations
• Able to set up debuggers, configure build environments and compilers, and optimize code and performance.
Skills / Tools:
• Expert at object-oriented programming
• Modular programming
• C / C++ / JavaScript / Python
• Eclipse framework
• Target deployment techniques
• IoT framework
• Test framework
Highlights :
• Having AI / ML knowledge in applications
• Have worked on wireless protocols
• Ethernet / Wi-Fi / Bluetooth / BLE
• Highly exploratory attitude; willing to venture into and learn new technologies.
• Have done passionate projects based on self-interest.
- Work with product managers to understand product requirements and make them live.
- Ownership of end-to-end development
- Startup mindset of getting things done and focusing on business goals
- Proven problem-solving skills
- 3+ years of hands-on experience in designing and developing applications using server-side technology (Java, Spring Boot / Node.js, Express)
- Excellent knowledge of Relational Databases, SQL and ORM technologies
- Good knowledge of design patterns
- Proficiency in REST architecture
- Experience with test-driven development
- Experience with Git/CI/CD/Gradle/Maven
- Inclination towards writing quality and performant code
- Experience in Agile development
- Performance tuning, testing, refactoring and automation
- Experience working with AWS Cloud and DevOps technologies (Terraform, CloudFormation, Ansible)
- Experience running a production environment
- 6-9 years of strong development skills in Java (JDK 1.8 or above).
- Experience in developing microservices in Spring Boot or Node.js.
- Experience with the security, transaction, idempotency, log tracing, distributed caching, monitoring, and containerization requirements of microservices.
- Experience in developing high-cohesion, loosely coupled microservices.
- Strong acumen in data structures, algorithms, problem-solving, and logical/analytical skills.
- Thorough understanding of design principles and the implementation of different types of design patterns.
- Sound understanding of concepts like exception handling, serialization/deserialization, and immutability.
- Good fundamental knowledge in Enums, Collections, Annotations, Generics, Autoboxing, etc.
- Experience with Multithreading, Concurrent Package and Concurrent APIs
- Basic understanding of Java Memory Management (JMM) including garbage collections concepts.
- Experience in RDBMS or NoSQL databases and writing SQL queries (joins, group by, aggregate functions, etc.)
- Hands-on experience with JMS
- Hands-on experience in creating and consuming RESTful web services
- Hands-on experience with Spring Boot and Spring Cloud
- Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j)
- Experience writing JUnit test cases using Mockito/PowerMock frameworks
- Should have practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN
- Good communication skills and ability to work with global teams to define and deliver on projects.
- Sound understanding/experience in software development process, test-driven development.
Additional Information
- Gender-Neutral Policy
- 18 paid holidays throughout the year for NCR/BLR (22 For Mumbai)
- Generous parental leave and new parent transition program
- Flexible work arrangements
- Employee Assistance Programs to help you with wellness and well-being
● J2EE with a good understanding of Servlets and JSP
● Experience in Spring Modules – Spring IOC and AOP, Spring Boot (version 2 plus), JDBC
● Expertise in design and development of various web and enterprise-level applications using Java/J2EE technologies such as Spring, Hibernate, and REST services
● Web Services (including SOAP, XML, XML Schema, JSON, and REST)
● Tools required: Maven, Eclipse, GitHub, and Swagger
● Good knowledge of SQL and Redis (NoSQL)
● Ability to document requirements and specifications
● Proven work experience as a Software Engineer or Software Developer
● Ability to develop software in Java or any other OOP language
● Excellent knowledge of relational databases, SQL and ORM technologies (JPA2, Hibernate)
● Experience in developing Web Applications using at least one popular Web Framework (Spring)
● Experience with test-driven development
● Proficiency in software engineering tools
We’re seeking a qualified Sales Manager to sell Whitehat Jr’s product that our customers have grown to rely on. The Sales Manager will utilise their skills to generate high-quality leads, build strong relationships with customers, and close deals. The ideal candidate will be a quick learner with strong negotiating skills and the ability to showcase our offerings in a compelling way.
Role details (please read)
Scope: Exceed targets for New Sales, Referrals or Renewals, in an individual contributor role
Work Location: Remote online work until offices reopen, candidate may be based anywhere in India
Working days: 6 working days with 1 day off, which may fall during the week
Shifts (subject to change): shift start hour will be after 6 AM and shift end hour before 12 midnight
Mandatory Language Fluency: English, Hindi
Laptop/Wi-Fi: candidates to use their own laptops; Wi-Fi will be reimbursed
Additional Compensation: If applicable, this will be decided basis your allocated shift after you join
Qualifications
- Proven B2C Sales track record of exceeding targets, >2 years of Sales experience
- Fluency in English and Hindi; ability and willingness to deliver in a high-pressure environment
- Excellent communication, interpersonal, problem-solving, presentation, and organisational skills
- Ability to counsel a parent for the child's future
- Comfortable with changing shift timings so that we may serve our customers better
- Graduation is not mandatory for candidates with 3+ years of experience
- Working knowledge of Salesforce, spreadsheets (Excel, Google Sheets), and PowerPoint











