
Strong MLOps profile
Mandatory (Experience 1) - Must have 8+ years of DevOps experience and 4+ years in MLOps / ML pipeline automation and production deployments
Mandatory (Experience 2) - Must have 4+ years hands-on experience in Apache Airflow / MWAA managing workflow orchestration in production
Mandatory (Experience 3) - Must have 4+ years hands-on experience in Apache Spark (EMR / Glue / managed or self-hosted) for distributed computation
Mandatory (Experience 4) - Must have strong hands-on experience across key AWS services including EKS/ECS/Fargate, Lambda, Kinesis, Athena/Redshift, S3, and CloudWatch
Mandatory (Experience 5) - Must have hands-on Python experience for pipeline and automation development
Mandatory (Experience 6) - Must have 4+ years of experience in AWS cloud, including at recent employers
Mandatory (Company) - Product companies preferred; exceptions for service-company candidates with strong MLOps and AWS depth
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and Xray defect management tools is good to have
- Exposure to the financial domain is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and develop permanent solutions
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Role: SOC Analyst
Job Type: Full Time, Permanent
Location: Onsite – Delhi
Experience Required: 1-3 Yrs
Skills Required:
1) Working knowledge across various security appliances (e.g., Firewall, WAF, Web Security Appliance, Email Security Appliance, Antivirus).
2) Experience with SOC Operations tools like SIEM, NDR, EDR, UEBA, SOAR, etc.
3) Strong analytical and problem-solving skills, with a deep understanding of cybersecurity principles, attack vectors, and threat intelligence.
4) Knowledge of network protocols, security technologies, and the ability to analyze and interpret security logs and events to identify potential threats.
5) Scripting skills (e.g., Python, Bash, PowerShell) for automation and analysis purposes.
6) Skilled in evaluating and integrating inputs from people, processes, and technologies to identify effective solutions.
7) Demonstrate a thorough understanding of the interdependencies between these elements and leverages this knowledge to develop comprehensive, efficient, and sustainable problem-solving strategies.
8) Excellent communication skills to articulate complex technical concepts to non-technical stakeholders and collaborate effectively with team members.
9) Ability to prioritize and manage multiple tasks in a dynamic environment.
10) Willingness to stay updated with the latest cybersecurity trends and technologies.
Job Responsibilities:
1) Continuously monitor and analyze security alerts and logs to identify potential incidents. Analyze network traffic patterns to detect anomalies and identify potential security breaches.
2) Implement correlation rules and create playbooks as per requirements. Continuously update and suggest new rules and playbooks based on the latest attack vectors and insights from public articles and cybersecurity reports.
3) Use security compliance and scanning solutions to conduct assessments and validate the effectiveness of security controls and policies. Suggest improvements to enhance the overall security posture.
4) Utilize deception security solutions to deceive and detect potential attackers within the network.
5) Leverage deep expertise in networking, system architecture, operating systems, virtual machines (VMs), servers, and applications to enhance cybersecurity operations.
6) Work effectively with cross-functional teams to implement and maintain robust security measures. Conduct thorough forensic analysis of security incidents to determine root causes and impact.
7) Assist with all phases of incident response. Develop and refine incident response strategies and procedures to address emerging cyber threats.
8) Perform digital forensics to understand attack vectors and impact. Swiftly respond to and mitigate security threats, ensuring the integrity and security of organizational systems and data.
9) Professionally communicate and report technical findings, security incidents, and mitigation recommendations to clients.
About Company
Innspark is the fastest-growing Deep-tech Solutions company that provides next-generation products and services in Cybersecurity and Telematics. The Cybersecurity segment provides out-of-the-box solutions to detect and respond to sophisticated cyber incidents, threats, and attacks. The solutions are powered by advanced Threat Intelligence, Machine Learning, and Artificial Intelligence that provides deep visibility of the enterprise’s security.
We have developed and implemented solutions for a wide range of customers with highly complex environments including Government Organizations, Banks & Financial institutes, PSU, Healthcare Providers, Private Enterprises.
Website: https://innspark.in/
We are looking for a goal-oriented channel sales manager for winning, maintaining, and expanding all our partner relationships within the assigned territory.
Your role will include supporting the company’s ongoing sales and growth opportunities by providing channel sales support to a region or portfolio of assigned partners or customers.
You will also be required to work with the technical engineering team whenever a customer needs customized products or services.
Responsibilities
- Bring new partners on board and maintain relationships with existing partners.
- The Channel Sales Manager will be responsible for winning, onboarding, maintaining, and expanding relations with channel partners within an assigned territory.
- The Channel Sales Manager will be responsible for understanding the overall sales strategy and implementing the necessary tactics to grow sales revenue.
- The Channel Sales Manager represents the company's entire range of products and services to partners in order to achieve partner sales objectives.
- Understand customer and business needs to cross-sell and up-sell the company’s products.
- Coordinate with other department personnel such as the support, warehouse, procurement, and management teams to deliver on and meet customer/partner expectations.
- Act as a bridge for communication between the customers/ partners and the engineering team.
- Assess, clarify, and validate partner needs and performance at regular intervals and maintain a high partner satisfaction rate.
- Coordinate with other sales channels to avoid any potential conflicts.
- Coordinate with partner sales personnel and maximize sales.
- Manage funnels, forecast, and seize sales opportunities.
- Drive and manage sales and marketing campaigns.
Requirements
- A minimum of 5 years of work experience.
- Electrical and Electronics degree.
- Thorough understanding of the Automation / Electronic industry.
- Exceptional communication skills.
- Flexible to travel to partner locations.
- Solution-oriented outlook.
- Ability to build lasting relationships.
and Artificial Intelligence (AI). It is headquartered in Ahmedabad, India, with a branch office in Pune.
We have worked on, and are working on, software engineering projects that span building full-fledged products: UI/UX, responsive and fast front-ends, platform-specific applications (Android, iOS, web, and desktop), very large-scale infrastructure, and cutting-edge machine learning and deep learning (AI in general). The projects/products have wide-ranging applications in the finance, healthcare, e-commerce, legal, HR/recruiting, pharmaceutical, leisure sports, and computer gaming domains. All of this uses core concepts of computer science such as distributed systems, operating systems, computer networks, process parallelism, cloud computing, embedded systems, and the Internet of Things.
PRIMARY RESPONSIBILITIES:
● Own the design, development, evaluation and deployment of highly-scalable software
products involving front-end and back-end development.
● Maintain quality, responsiveness and stability of the system.
● Design and develop memory-efficient, compute-optimized solutions for the
software.
● Design and administer automated testing tools and continuous integration
tools.
● Produce comprehensive and usable software documentation.
● Evaluate and make decisions on the use of new tools and technologies.
● Mentor other development engineers.
KNOWLEDGE AND SKILL REQUIREMENTS:
● Mastery of one or more back-end programming languages (Python, Java, Scala, C++
etc.)
● Proficiency in front-end programming paradigms and libraries (for example: HTML, CSS, and advanced JavaScript libraries and frameworks such as Angular, Knockout, React).
● Knowledge of automated and continuous integration testing tools (Jenkins, TeamCity, CircleCI, etc.)
● Proven experience of platform-level development for large-scale systems.
● Deep understanding of various database systems (MySQL, Mongo,
Cassandra).
● Ability to plan and design software system architecture.
● Development experience for mobile, browsers and desktop systems is
desired.
● Knowledge and experience of using distributed systems (Hadoop, Spark)
and cloud environments (Amazon EC2, Google Compute Engine, Microsoft
Azure).
● Experience working in agile development. Knowledge and prior experience of tools
like Jira is desired.
● Experience with version control systems (Git, Subversion or Mercurial).
- MSSQL, SSIS, performance tuning, data modeling.
- Experience with NoSQL tools is a must.
- Candidate should have exposure to writing scripts in Python, Node.js, MongoDB, or any other tool.
- MariaDB/MySQL, MongoDB, GitHub, Jira.
- Understand requirements from the client and on-site team.
- Participate in preparing design and data modeling.
- Comfortable with SQL stored procedure and query development.
- Manage SQL Server through multiple product lifecycles/environments, from development to critical production systems.
- Apply industry-standard techniques for data modeling to ensure performance, integrity, and requirements are met.
- Ability to analyze problems independently and resolve them on time.
To generate the lead
To create dealers, distributors network
To handle channel sales
To visit retail outlets
To collect orders
To generate sales and revenue
To achieve sales target
To appoint Block Sales Managers for every block
To guide them and train them
To inform them of their job roles and responsibilities
To promote brand in the market
To create opportunities for business development
To create positive feedback in the market
To build a positive rapport in the market through the team
To generate sales through team
To achieve sales target through team