
Similar jobs

Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications (a hedged example sketch follows this list).
- Experience in performance tuning.
- Experience in data warehouse design and cost optimization.
- Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
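A minimal sketch of the data-quality checks referenced above (evaluating a pipeline output against quality specifications), assuming a PySpark environment; the storage path, column names, and thresholds are hypothetical illustrations, not a prescribed implementation:

```python
# Minimal sketch of post-pipeline data-quality checks (hypothetical path, columns, thresholds).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Assume the pipeline wrote its output here; the location and schema are illustrative only.
orders = spark.read.parquet("s3://example-bucket/curated/orders/")

checks = {
    # No duplicate business keys after the join/merge step.
    "unique_order_id": orders.groupBy("order_id").count().filter(F.col("count") > 1).count() == 0,
    # Mandatory columns must not contain nulls.
    "no_null_customer_id": orders.filter(F.col("customer_id").isNull()).count() == 0,
    # Amounts must be non-negative.
    "non_negative_amount": orders.filter(F.col("order_amount") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise AssertionError(f"Data-quality checks failed: {failed}")
print("All data-quality checks passed.")
```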
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
Design, build, and maintain scalable data pipelines using Databricks and PySpark.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
Ensure data quality, performance, and reliability across data workflows.
Participate in code reviews, data architecture discussions, and performance optimization initiatives.
Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions); a hedged PySpark/SQL sketch illustrating these follows this skills list.
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
Skills: Databricks, PySpark & Python, SQL, AWS Services
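As noted in the skills list above, here is a hedged, illustrative PySpark sketch of a typical Databricks ETL step (a join plus a window-function dedup); all table and column names are hypothetical assumptions for illustration only:

```python
# Illustrative Databricks/PySpark ETL step (hypothetical tables and columns):
# join two sources, deduplicate with a window function, and write a curated table.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = spark.read.table("raw.orders")          # assumed raw ingestion table
customers = spark.read.table("raw.customers")    # assumed raw ingestion table

# Keep only the latest record per order_id (window function, analogous to ROW_NUMBER() in SQL).
latest = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())

curated = (
    orders
    .withColumn("rn", F.row_number().over(latest))
    .filter("rn = 1")
    .drop("rn")
    .join(customers, on="customer_id", how="left")
    .select("order_id", "customer_id", "customer_name", "order_amount", "updated_at")
)

curated.write.mode("overwrite").saveAsTable("curated.orders_enriched")
```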
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
******
Notice period - Immediate to 15 days
Location: Bangalore
- Minimum of 5 years of experience in B2B enterprise sales, with at least 1-2 years in a start-up.
- Should have a proven track record of closing complex deals and managing large accounts.
- Strong understanding of the full sales lifecycle, from prospecting to closing, with expertise in consultative selling and solution-based sales.
- Strong negotiation and deal-closing abilities, with the capacity to build trust and rapport with high-level stakeholders.
- Strong experience using CRM systems to track and manage sales pipelines, opportunities, and client interactions.
- Should have driven a 0-to-1 sales growth story.
- Must currently be handling the West region; candidates from Mumbai preferred.
About the company:
Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.
About the Role:
We are seeking an experienced Technical Data Professional with hands-on expertise in designing and implementing dimensional data models using Erwin or any other dimensional modeling tool, and in building SQL-based solutions adhering to Data Vault 2.0 and Information Mart standards. The ideal candidate will have strong data analysis capabilities, exceptional SQL skills, and a deep understanding of data relationships, metrics, and the granularity of data structures.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Certifications in a related field will be an added advantage.
Key Competencies:
1. Technical Expertise:
- Proficiency in Erwin for data modeling.
- Advanced SQL skills with experience in writing and optimizing performance-driven queries.
- Hands-on experience with Data Vault 2.0 and Information Mart standards is highly preferred.
- Solid understanding of Star Schema, Facts & Dimensions, and BUS architecture.
2. Analytical Skills:
- Strong data analysis skills to evaluate data relationships, metrics, and granularities.
- Capability to troubleshoot and resolve complex data modeling and performance issues.
3. Soft Skills:
- Strong problem-solving and decision-making skills.
- Excellent communication and stakeholder management abilities.
- Proactive and detail-oriented with a focus on delivering high-quality results.
Key Responsibilities:
1. Dimensional Data Modeling:
- Design and develop dimensional data models using Erwin with a focus on Star Schema and BUS architecture (Fact and Dimension tables).
- Ensure models align with business requirements and provide scalability, performance, and maintainability.
2. SQL Development:
- Implement data models in SQL using best practices for view creation, ensuring high performance.
- Write, optimize, and refactor complex SQL queries for efficiency and performance in large-scale databases.
- Develop solutions adhering to Information Mart and Data Vault 2.0 standards (i.e., a dimensional model built from Raw Data Vault objects: Hubs, Links, Satellites, Effectivity Satellites, Bridge, and PIT tables); a hedged sketch of this join pattern follows the responsibilities list below.
3. Data Analysis & Relationship Metrics:
- Perform in-depth data analysis to identify patterns, relationships, and metrics at different levels of granularity.
- Ensure data integrity and quality by validating data models against business expectations.
4. Performance Optimization:
- Conduct performance tuning of existing data structures, queries, and ETL processes.
- Provide guidance on database indexing, partitioning, and query optimization techniques.
5. Collaboration:
- Work closely with business stakeholders, data engineers, and analysts to understand and translate business needs into effective data solutions.
- Support cross-functional teams to ensure seamless integration and delivery of data solutions.
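As referenced in the SQL Development responsibilities above, here is a hedged sketch of how an Information Mart dimension view might be assembled from Raw Vault objects. All schema, table, and column names (hub_customer, sat_customer_details, pit_customer) are hypothetical, and Spark SQL is used only to stay consistent with the other sketches in this document; the same join pattern applies in any SQL engine:

```python
# Hedged sketch: building an Information Mart dimension view from hypothetical
# Raw Vault objects (hub_customer, sat_customer_details, pit_customer).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("info-mart-sketch").getOrCreate()

spark.sql("""
CREATE OR REPLACE VIEW information_mart.dim_customer AS
SELECT
    h.customer_hk  AS customer_key,              -- hub hash key
    h.customer_id  AS customer_id,               -- business key
    s.customer_name,
    s.customer_segment,
    p.snapshot_date
FROM raw_vault.hub_customer h
JOIN raw_vault.pit_customer p                    -- point-in-time table selects the
    ON p.customer_hk = h.customer_hk             -- satellite rows valid at each snapshot
JOIN raw_vault.sat_customer_details s
    ON s.customer_hk = p.customer_hk
   AND s.load_date   = p.sat_customer_details_ldts
""")
```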
Temporary full-time role
Job requirement - Sourcing and TA for top brand startups in India and around the world.
Need experience and background in HR, TA, and outbound search/sourcing.
Prior experience with tech hiring for startups is highly preferred.
We are seeking freelance recruiters and therefore provide flexible work arrangements that accommodate multiple jobs.
Catalyst IQ is a new-age Recruitment and Advisory startup with leadership drawn from international and local business exposure. We are building an effective marketplace for freelance recruiters. We specialise in mandates for EdTech, AgriTech, FinTech (neobanking, DeFi, and accounting ERP), e-gaming, and others.
Value proposition - Catalyst IQ offers the potential to earn, by working part-time for us, the monthly salary that a typical HR recruiter with 0-4 years of experience makes in a full-time role.
Connect with us to learn more.
- Strong experience in Java programming.
- Must have experience in microservices using Spring Boot, Jersey, Swagger, or any other microservices technology stack.
- Good experience in either Spring or Hibernate
- Must have at least 1 to 2 years’ experience in web application development.
- Knowledge of OOP concepts, industry best practices, and design
- Good understanding of web technology/enterprise-level applications
- Good to have experience in JavaScript frameworks
- Good to have experience in Agile Methodology
- Self-motivated and a Quick Learner
- Creative ideas with a problem-solving mindset.
- Ability to consistently perform and meet deadlines
- Attention to detail and follow-through
- A good understanding of customer satisfaction
- Ability to work effectively in a team as well as in an individual environment
- Excellent written and verbal communication skills

- Skills: Angular 2+, JavaScript, HTML, CSS.
- Experience: 5-9 years.
- Experience with JavaScript libraries and frameworks.
- Should have a strong sense of web design and be attuned to the fundamentals of user experience.
- Should be able to design, build, and maintain efficient, reusable, and reliable front-end code.
- 5-6 years of proven excellence in IT business development or inside sales.
- Mandatory experience in appointment setting.
- Must have B2B sales experience.
- Shift timings: 5:30 pm to 2:30 am IST (8:00 am to 5:00 pm US EST)
- or 6:30 pm to 3:30 am IST (9:00 am to 6:00 pm US EST)
- Tool handling: LinkedIn Sales Navigator, HubSpot, Salesforce, Outreach, etc.
- Location - Remote (PAN INDIA)
- 3+ years of demonstrated experience in testing web and mobile applications.
- 1+ years of experience in JS-based web application testing (e.g., AngularJS, Vue.js, React.js).
- Create and maintain API-based automation test scripts using tools like JMeter, SoapUI, or Postman.
- API automation using SoapUI/REST Assured/Postman (mandatory); an illustrative Python-based sketch follows this list.
- Create and maintain API specific test plans
- Conduct performance and load tests on the APIs and provide detailed reports and analysis on metrics such as breakage point.
- Participate in creating and clarifying User Stories, and in planning Sprints.
- Maintaining all the testing artefacts in SharePoint.
- Updating test management tools (JIRA, Bugzilla, etc.) so that the current status of the project can be known at any time by stakeholders.
- Issue tracking, analysis and reporting.
- Excellent verbal and written communication skills.
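As referenced in the list above, here is an illustrative API test sketch. It uses Python's requests library rather than the SoapUI/REST Assured/Postman tools named in the posting (to keep one language across the sketches in this document), and the endpoint, response fields, and 500 ms threshold are hypothetical assumptions:

```python
# Illustration only: a minimal functional API check with a simple response-time assertion.
# Endpoint, fields, and the 500 ms specification are hypothetical.
import requests

BASE_URL = "https://api.example.com"   # hypothetical endpoint


def test_get_user_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/users/123", timeout=5)
    # Functional checks: status code and required fields in the JSON body.
    assert resp.status_code == 200
    body = resp.json()
    assert all(key in body for key in ("id", "name", "email"))
    # Simple performance check against a hypothetical 500 ms specification.
    assert resp.elapsed.total_seconds() < 0.5


if __name__ == "__main__":
    test_get_user_returns_expected_fields()
    print("API check passed.")
```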
A. Strong passion for Programming in general and Android App development in specific.
B. Strong problem-solving skills.
C. Strong system design and architecture skills - specifically for android.
D. Curiosity to tinker around, explore new paradigms and strong zest for continuous improvement.
E. 4+ years of Android app development experience with strong basics and complete exposure to Android development.
F. Idea/experience of unit and instrumentation testing in Android.
G. E2E App development and/or experience of developing SDKs is good to have.







