
About 108 Collective
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Develop and implement quality standards and quality control systems.
Monitor, measure, and analyze overall quality performance.
Inspect and test materials, equipment, processes, and finished products to ensure compliance with quality specifications.
Collaborate with operations managers to establish controls and implement process improvements.
Ensure workflows, processes, and products meet safety and regulatory requirements.
Investigate and troubleshoot product or production-related issues.
Develop and implement corrective actions and continuous improvement solutions.
Review and maintain codes, specifications, and quality-related documentation.

Job Description: WordPress Developer & Designer
Location: Remote / Work From Home
Employment Type: Project-Based (Freelance/Contract)
Experience Required: 4+ Years
About Us
Augmentive Business 7 Solutions Pvt. Ltd. (AB7 Solutions) is a global professional services company providing outsourcing solutions in Healthcare, Business Support, and Digital & IT Services. We are expanding our digital team and looking for an experienced WordPress Developer & Designer to deliver high-quality client projects.
Role Overview
The WordPress Developer & Designer will be responsible for creating, customizing, and maintaining responsive WordPress websites with a strong focus on functionality, performance, and user experience.
Key Responsibilities
- Develop and customize WordPress websites, themes, and plugins.
- Design responsive, visually appealing layouts aligned with client branding.
- Optimize websites for speed, SEO, and mobile compatibility.
- Implement WooCommerce, payment gateways, and API integrations.
- Collaborate with clients and internal teams to deliver projects on time.
Qualifications & Skills
- Graduate in Computer Science/IT or equivalent.
- 4+ years of proven WordPress development & design experience.
- Strong knowledge of PHP, HTML, CSS, JavaScript, and MySQL.
- Expertise in Elementor, WPBakery, and WooCommerce.
- Proficiency in Figma/Adobe XD/Photoshop for design work.
- Excellent communication skills, ability to handle multiple projects.
Role Overview
We are seeking a highly skilled and experienced Senior AI Engineer with deep expertise in computer vision and architectural design. The ideal candidate will lead the development of robust, scalable AI systems, drive architectural decisions, and contribute significantly to the deployment of real-time video analytics, multi-model systems, and intelligent automation solutions.
Key Responsibilities
Design and lead the architecture of complex AI systems in the domain of computer vision and real-time inference.
Build and deploy models for object detection, image segmentation, classification, and tracking.
Mentor and guide junior engineers on deep learning best practices and scalable software engineering.
Drive end-to-end ML pipelines: from data ingestion and augmentation to training, deployment, and monitoring.
Work with YOLO-based and transformer-based models for industrial use-cases.
Lead integration of AI systems into production with hardware, backend, and DevOps teams.
Develop automated benchmarking, annotation, and evaluation tools.
Ensure maintainability, scalability, and reproducibility of models through version control, CI/CD, and containerization.
Required Skills
Advanced proficiency in Python and deep learning frameworks (PyTorch, TensorFlow).
Strong experience with YOLO, segmentation networks (UNet, Mask R-CNN), and tracking (Deep SORT).
Sound understanding of real-time video analytics and inference optimization.
Hands-on experience designing model pipelines using Docker, Git, MLflow, or similar tools.
Familiarity with OpenCV, NumPy, and image processing techniques.
Proficiency in deploying models on Linux systems with GPUs or edge devices (Jetson, Coral).
Good to Have
Experience with multi-model orchestration, streaming inference (DeepStream), or virtual camera inputs.
Exposure to production-level MLOps practices.
Knowledge of cloud-based deployment on AWS, GCP, or DigitalOcean.
Familiarity with synthetic data generation, augmentation libraries, and 3D modeling tools.
Publications, patents, or open-source contributions in the AI/ML space.
Qualifications
B.E./B.Tech/M.Tech in Computer Science, Electrical Engineering, or related field.
4+ years of proven experience in AI/ML with a focus on computer vision and system-level design.
Strong portfolio or demonstrable projects in production environments.
Hiring PHP Developers - Durgapur, West Bengal
WORK FROM OFFICE (Durgapur)
Candidate should relocate to Durgapur to do the job.
Timing: 10 AM-7 PM
Workdays: Monday-Friday (Only last Saturday of every month is working)
Candidates having experience of minimum 3 years can apply.
Responsibilities and Duties:
Work on live projects for international clients.
Handle multiple projects simultaneously.
Write error-free code.
Requirement -
Minimum 3 Years Experience.
Knowledge of custom PHP, CodeIgniter, Laravel
Knowledge of MySQL
Knowledge of HTML5, CSS3
Knowledge of JavaScript
Knowledge of jQuery
Knowledge of AJAX
Knowledge of Web APIs & REST APIs
Knowledge of third-party API integration
Education:
Bachelor's (Required)
Experience:
PHP: 3 years (Required)
Laravel: 3 years (Preferred)
CodeIgniter: 3 years (Preferred)
Job Type: Full-time
Job Description
Sr Linux System Administrator
Experience: 5-10 years
Skills:
- VMware, vCenter, ESXi Administration
- RHEL/AIX OS Upgrades
- RHEL and AIX Patching
- Installation and configuration of services and software components.
- Handling hardware failures and server crashes
- Linux filesystem (ext3/4, XFS, ZFS, etc.) management
- LVM and NFS Filesystem management
- User Administration, PAM, LDAP
- Ansible, Git, Jenkins, CI/CD
- Networking
- Handling files, directories, and users.
- OS: RHEL, Oracle Linux, CentOS, UNIX (AIX), others
- AWS, Docker, and cloud administration (optional)
Shift: Willing to work in rotational shifts, including weekend on-call support.
Work from home is available for core night shifts and weekend support; extra allowances are provided for these shifts.
Job location: Bangalore
Notice period: immediate to 30 days
Skills
AWS, Linux, VMware, vCenter, ESXi, AIX Patching, RHEL, PAM, Ansible, Networking, OS RHEL, Oracle, Filesystems
UI/UX Designer
Experience – 0-2 years
Salary – 1.4 - 2 lacs annual CTC
Desired Candidate Profile
Should have basic knowledge of software like Adobe XD, Illustrator, Figma, CorelDRAW, etc. Basic knowledge of HTML, CSS.
Creative & a team player.
Job Description
Developing web & mobile application layouts based on UI designing standards.
Designing compelling banners for social media, job openings, company branding & logo, various formats, etc. as and when required.
Understanding project needs & participating in various design projects from concept to completion.
Understanding feedback to sharpen skills and improve the design.
- Be a passionate advocate for customers and help ensure thorough and intuitive customer journeys.
- Targeting customers with better customer segmentation, hypotheses, testing with experiments, campaign analytics, communication process automation, and competitive benchmarking.
- Working closely with Product, Analytics, Business and Design teams
- Owning one or more key traffic channels for a category (Personal Loans, Home Loans, Health Insurance, and Mutual Funds) and building the optimum funnel for the business.
- Conceptualize, design and execute initiatives through extensive experiments.
- Work on new product, feature, and scheme launches, making them successful by targeting customers through various channels
- Analyse traffic patterns and optimize for seasonality, relevance, and selection
- Regularly develop, test and analyse new tactics that sell products across all categories and increase conversion rates, documenting the results and sharing best practices with the team
- Own projects with considerable scope and/or complexity with significant impact on customer experience
BASIC QUALIFICATIONS
- 1-3 years of experience in roles requiring high analytical skills
- Experience using data and metrics to measure impact and determine improvements
- Experience in making data-driven decisions, developing customer insights from data and creating strategies to improve conversion and sales.
- Understanding of product funnel and journeys.
- Excellent written and oral communication skills, with the ability to communicate effectively in reviews and meetings
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Author data services using a variety of programming languages
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Work in an Agile environment with Scrum teams.
- Ensure data quality and help in achieving data governance.
Basic Qualifications
- 2+ years of experience in a Data Engineer role
- Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
- Experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases
- Experience with data pipeline and workflow management tools
- Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
- Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
- Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
- Strong analytic skills related to working with unstructured datasets
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
- Experience supporting and working with cross-functional teams in a dynamic environment








