

ADF Developer with top conglomerates for Kochi location - Air India
Conducting F2F interviews on 22nd April 2023
Experience: 2-12 years
Location: Kochi only (work from the office only)
Notice period: 1 month only
If you are interested, please share the following information at your earliest convenience.


Company: Kredit Venture
About the company:
KreditVenture is seeking a Technical Product Manager to lead the development, strategy, and
execution of our SaaS applications built on top of Loan Origination Systems and Lending Platforms.
This role requires a strong technical background, a product ownership mindset, and the ability to
drive execution through both in-house teams and outsourced vendors. The ideal candidate will play
a key role in aligning business goals with technical implementation, ensuring a scalable, secure,
and user-centric platform.
Job Description
Job Title: Senior Manager / AVP / DVP – Technical Product Manager
Location: Mumbai (Ghatkopar West)
Compensation: Up to 25 LPA
Experience: 7-8 years (Designation will be based on experience)
Qualification:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- An MBA is a plus.
Roles and Responsibilities
Technology Leadership:
- Lead SaaS Platform Development – Strong expertise in full-stack development (Java, Python, MERN stack) and cloud-based architectures.
- API & Workflow Design – Drive microservices-based REST API development and implement business process automation.
- Third-Party Integrations – Enable seamless API integrations with external service providers.
- Code Quality & Best Practices – Ensure code quality, security, and performance optimization through structured audits.
Vendor & Delivery Management:
- Outsourced Vendor Oversight – Manage and collaborate with external development partners, ensuring high-quality and timely delivery.
- Delivery Governance – Define SLAs, monitor vendor performance, and proactively escalate risks.
- Quality Assurance – Ensure vendor deliverables align with product standards and integrate smoothly with internal development.
Collaboration & Stakeholder Engagement:
- Customer Insights & Feedback – Conduct user research and feedback sessions to enhance platform capabilities.
- Product Demos & GTM Support – Showcase platform features to potential clients and support sales & business development initiatives.
Platform Development & Compliance:
- Component Libraries & Workflow Automation – Develop reusable UI components and enable no-code/low-code business workflows.
- Security & Compliance – Ensure adherence to data protection, authentication, and regulatory standards (e.g., GDPR, PCI-DSS).
- Performance Monitoring & Analytics – Define KPIs and drive continuous performance optimization.

About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position summary:
We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake (see the illustrative sketch after this list).
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
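For illustration only, here is a minimal PySpark batch ETL sketch of the kind of pipeline work described above. It is not taken from this employer's codebase: the SparkSession setup, input/output paths, and column names are hypothetical assumptions, and a Delta Lake or Snowflake target would typically replace the final writer.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_etl").getOrCreate()

# Extract: raw order events (the path is a placeholder).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: drop malformed rows and aggregate revenue per customer per day.
daily_revenue = (
    orders
    .where(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write a partitioned result for downstream analytics
# (a Delta table or Snowflake stage would replace this writer in practice).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/marts/daily_revenue/"
)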
Basic Qualifications:
- Bachelor’s or Master’s Degree in Computer Science or Data Science.
- 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.

Role & Responsibilities:
- Individuals are responsible for all traditional development activities: analysis, design, coding, testing, and documentation.
- Add new features to existing Windows and web-based applications in Visual Studio 2019.
- Transform business and design needs into innovative products.
- Drive all projects to completion within deadlines.
- Perform thorough testing to find issues ahead of time.
- Apply precise analysis and coding skills to engineering software.
- Enhance your skill set by working with Fortune 500 clients.
- Interact with the internal team.
Qualifications:
- B.E. in Computer Science or related fields.
- Hands-on development experience with web & database technologies (MS SQL).
- Excellent programming and communication skills; strong problem-solving, judgment, and decision-making skills.
- Ability to work at the tactical and strategic levels of IT initiatives.
- Impressive hands-on experience in .NET technologies, with strong knowledge of C#, SQL, JSON, XML, WinForms & MVC, Angular, JavaScript, jQuery, and HTML/CSS.
- Self-motivated, flexible, and innovative.
- Hands-on experience with OneStream or Hyperion Planning/Essbase applications.
- Implementation experience, including building applications and modules and writing business rules, is a must.
- VB.NET scripting experience or Essbase calculation expertise is mandatory.
- Hyperion Planning & Essbase experience is a plus.
- Cloud experience is preferred (Azure, GCP).
- Hands-on experience with SQL Server and RDBMS systems.
Job description
DCT is looking for an experienced Drupal Frontend Developer who can deliver stunning visual solutions for Drupal web pages. You will work on both international and local projects.
Responsibilities
- Develop and test Drupal site builds and themes
- Support existing Drupal websites and themes
- Perform code reviews
- Cooperate with UX/UI designers and backend developers
- Work together with a project team in an agile way
- Stay up to date with new trends in web development
Requirements
- Strong Drupal site building experience
- Experience with a wide range of Drupal modules and themes, as well as frontend frameworks
- Strong HTML5, SASS and JavaScript knowledge
- Familiarity with PHP and Twig
- Ability to understand requirements and quickly identify changes to the project scope
- Understanding of common git-based development workflows and code management
- Motivation to step outside of your comfort zone and learn new technologies and systems
- At least strong intermediate English (B2)
- 3+ years of experience
Advantages
- Familiarity with PatternLab
- Experience with Drupal 8 Theming
- Understanding of frontend performance issues facing high traffic Drupal sites
- Frontend related open-source contributions
- Drupal 8 Frontend Developer or Sitebuilder certification


- Should be able to design robust backend architecture using different technologies to retrieve data from the servers.
- Creating databases and servers that are resistant to outages and operate continuously.
- Ensuring cross-platform compatibility by creating applications that work on different platforms.
- Depending on the type of application, the developer is responsible for creating APIs.
- The developer is responsible for building flexible applications that meet consumer requirements.
Must have:
- Angular 6+ experience is a must
- Java experience
- Web application development (with API integration) experience
- UI design and implementation skills
- Quick Learner/Passion to learn
- Good Communication skills
Good to have:
- Flutter experience
The PPC Manager is responsible for developing and implementing effective PPC campaigns to hit goals
and ROI, both short term and long term; managing PPC budgets while building and strengthening
platform accounts and key relationships; and tracking daily, weekly, and monthly KPIs to identify
opportunities for performance improvement, with regular reporting to management.
Responsibilities:
• Creating, developing & implementing effective paid search strategies
• Executing & optimizing many PPC campaigns simultaneously
• Overseeing campaigns across a number of search platforms (e.g. Google AdWords, Bing)
• Targeting selected audiences through researching keywords
• Managing campaign budgets & adjusting bids to optimize the ROI
• Tracking daily, weekly & monthly KPIs to identify opportunities for improvement
• Reporting KPIs to management on a regular basis through various dashboards
• Producing engaging, clear & concise copy for campaigns
• Ensuring that campaigns are aligned across multiple channels
• Building and strengthening key relationships across PPC ad platforms and various vendors
• Reducing the risk of click fraud
• Staying current with PPC & SEM trends and techniques
Requirements:
• Minimum of 2 years' experience as a PPC Manager or Digital Marketing Specialist
• Strong analytical skills and dashboard creation
• Understanding of digital marketing concepts and SEO
• Experience with multiple platforms, e.g., AdWords, Facebook, Yahoo, Bing
• Working knowledge of Google Analytics or similar analytical tools
• Excellent communication, both verbal and written
• Analytically minded and strong in arithmetic
• BSc/BA in Marketing, Digital Media or a related field
• AdWords certification is an advantage but not essential

Position: PHP Developer
Location: Bangalore
Experience: 2-10yrs
Salary: Up to 12 LPA (max)
Notice period: Immediate to 30 days (max)
Key skills: PHP, MySQL, and JavaScript






- 2+ years of experience participating in the delivery of technology services.
- A proven ability to learn new applications & innovate in technology
- Strong Experience in technologies like PHP
- Demonstrable knowledge of web technologies including HTML, CSS, JavaScript, AJAX, etc.
- Good knowledge of relational databases, version control tools and of developing web services
- Experience in common third-party APIs (Google, Facebook, eBay, etc.)
- Passion for best design and coding practices and a desire to develop new bold ideas
- Experience in integrating, designing and developing solutions is desirable

