

Auxo AI
https://auxoai.com
Jobs at Auxo AI
AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 4 - 10 years of prior experience in data engineering, with a strong background in AWS (Amazon Web Services) technologies. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.
Experience: 4-10 years
Notice Period: Immediate to 15 days
Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes leveraging AWS services such as S3, Glue, EMR, Lambda, and Redshift (a minimal sketch follows this list).
- Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
- Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
- Implement data governance and security best practices to ensure compliance and data integrity.
- Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
- Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
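For illustration only (nothing in this posting specifies the implementation): a minimal sketch of the event-driven ingestion pattern referenced above, assuming a hypothetical Glue job named raw_to_redshift and an S3 landing bucket configured to invoke a Lambda function on object creation.

```python
# Hypothetical sketch: S3-triggered Lambda that starts a Glue ETL job.
# The job name "raw_to_redshift" and its arguments are assumptions for
# illustration; they are not defined in this posting.
import json
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Start a Glue job run for every object created in the landing bucket."""
    run_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="raw_to_redshift",            # assumed Glue job name
            Arguments={                           # passed to the job as --source_path
                "--source_path": f"s3://{bucket}/{key}",
            },
        )
        run_ids.append(response["JobRunId"])
    return {"statusCode": 200, "body": json.dumps({"job_runs": run_ids})}
```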
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 4-10 years of prior experience in data engineering, with a focus on designing and building data pipelines.
- Proficiency in AWS services, particularly S3, Glue, EMR, Lambda, and Redshift.
- Strong programming skills in languages such as Python, Java, or Scala.
- Experience with SQL and NoSQL databases, data warehousing concepts, and big data technologies.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools (e.g., Apache Airflow) is a plus.
AuxoAI is seeking a skilled and experienced Power BI Specialist with 4-10 years of hands-on experience in developing and managing Business Intelligence (BI) solutions. As a Power BI Specialist, you will work closely with stakeholders to design, develop, and deploy interactive dashboards and reports, providing valuable insights and supporting data-driven decision-making. You should have a strong technical background in Power BI development and data modeling, along with a deep understanding of business analytics.
Responsibilities:
Design, develop, and implement Power BI reports, dashboards, and data visualizations based on business requirements.
Create and maintain data models to ensure the accuracy and efficiency of reporting structures.
Extract, transform, and load (ETL) data from various sources (databases, flat files, web services, etc.) into Power BI (see the sketch after this list).
Collaborate with business stakeholders to understand reporting requirements and deliver solutions that meet their needs.
Optimize Power BI reports and dashboards for better performance and usability.
Implement security measures such as row-level security (RLS) for reports and dashboards.
Develop and deploy advanced DAX (Data Analysis Expressions) formulas for complex data modeling and analysis.
Use SQL for data extraction and manipulation where required.
Apply strong analytical and problem-solving skills with keen attention to detail.
Publish and share reports through the Power BI Service and Power BI Workspaces.
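As a hedged illustration of the ETL and data-modeling work described above (the sales.csv file and its column names are assumptions, not details from the role): a short pandas sketch that cleans a flat extract and splits it into a fact table and a date dimension ahead of a Power BI import.

```python
# Illustrative only: shape a flat sales extract into a small star schema
# (fact + date dimension) before loading it into Power BI. The file name and
# columns ("order_id", "order_date", "customer", "amount") are assumptions.
import pandas as pd

raw = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic cleansing: drop exact duplicates and rows missing key fields.
clean = raw.drop_duplicates().dropna(subset=["order_id", "order_date", "amount"])

# Date dimension: one row per calendar date, with attributes DAX can slice on.
dim_date = (
    pd.DataFrame({"date": pd.date_range(clean["order_date"].min(),
                                        clean["order_date"].max())})
    .assign(year=lambda d: d["date"].dt.year,
            month=lambda d: d["date"].dt.month,
            month_name=lambda d: d["date"].dt.strftime("%b"))
)

# Fact table keyed on the date dimension.
fact_sales = clean.rename(columns={"order_date": "date"})[
    ["order_id", "date", "customer", "amount"]
]

fact_sales.to_csv("fact_sales.csv", index=False)
dim_date.to_csv("dim_date.csv", index=False)
```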
Qualifications:
Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
5+ years of experience in Power BI report development and dashboard creation.
Strong proficiency in Power BI Desktop and Power BI Service.
Experience in data modeling, including relationships, hierarchies, and DAX formulas.
Expertise in Power Query for data transformation and ETL processes.
Knowledge of business processes and ability to translate business needs into technical solutions.
Excellent communication and collaboration skills to work effectively with business teams and IT professionals.
Ability to manage multiple projects and meet deadlines in a fast-paced environment.
AuxoAI is looking for a highly motivated and detail-oriented Anaplan Modeler with foundational knowledge of Supply Chain Planning. You will work closely with Solution Architects and Senior Modelers to support the design, development, and optimization of Anaplan models for enterprise planning needs across industries.
This role offers a strong growth path toward Senior Modeler and Solution Architect roles and a chance to contribute to cutting-edge planning projects.
Responsibilities:
Build and maintain Anaplan models, modules, dashboards, and actions under defined specifications.
Execute data loading, validation, and basic transformation within Anaplan (see the validation sketch after this list).
Optimize existing models for better performance and usability.
Apply Anaplan best practices and methodology in daily work.
Use Anaplan Connect or similar tools for data integration.
Assist in creating model documentation, user guides, and test plans.
Support testing and troubleshooting activities.
Develop a basic understanding of supply chain processes (demand, supply, inventory planning).
Collaborate effectively with internal teams and business stakeholders.
Participate in training and skill-building programs within the Anaplan ecosystem.
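Purely as a sketch of the data loading and validation step (the file name, columns, and rules are assumptions; the actual load would typically run through Anaplan Connect or the Anaplan APIs): a minimal pre-load check on a demand-planning extract.

```python
# Illustrative pre-load check for a demand planning file before it is pushed
# to Anaplan (e.g. via Anaplan Connect). Columns and rules are assumptions.
import pandas as pd

REQUIRED = ["sku", "location", "period", "forecast_qty"]

def validate(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    # Reject negative or non-numeric forecast quantities.
    df["forecast_qty"] = pd.to_numeric(df["forecast_qty"], errors="coerce")
    bad = df[df["forecast_qty"].isna() | (df["forecast_qty"] < 0)]
    if not bad.empty:
        raise ValueError(f"{len(bad)} rows have invalid forecast_qty values")

    # Deduplicate on the natural key so the Anaplan import stays idempotent.
    return df.drop_duplicates(subset=["sku", "location", "period"])

if __name__ == "__main__":
    clean = validate("demand_forecast.csv")
    clean.to_csv("demand_forecast_validated.csv", index=False)
```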
Requirements
Bachelor’s or Master’s in Engineering, Supply Chain, Operations Research, or a related field
3+ years of experience in a business/IT role with hands-on Anaplan modelling exposure
Anaplan Model Builder Certification is mandatory
Basic understanding of supply chain concepts like demand, supply, and inventory planning
Strong problem-solving and logical thinking skills
Proficiency in Excel or basic data analysis tools
Excellent attention to detail and communication skills
Enthusiasm to grow within the Anaplan and supply chain domain
AuxoAI is hiring a Finance Associate with hands-on experience in accounting, statutory compliance, and financial operations. The ideal candidate brings 1–3 years of experience from a CA firm or corporate setup, and is based in Bangalore or open to relocating.
While AuxoAI typically follows a hybrid work model (3 days onsite), this role may require working from the office 5 days a week during peak periods (month-end, audits, etc.).
Location : Bangalore
Responsibilities:
- Maintain accurate books of accounts: AP entries, GL reconciliation, BRS, and month-end closing activities
- Manage day-to-day accounting and bookkeeping operations
- Ensure timely compliance with Indian statutory requirements: TDS, GST, and Income Tax filings
- Assist with audit preparations, schedules, and reconciliation reports
- Coordinate with vendors, internal teams, and auditors
- Maintain accurate documentation for tax and financial records
- Support finance lead with MIS reporting and cost tracking
- Contribute to process improvement and automation initiatives
Requirements
- 1–3 years of relevant experience in a CA firm or corporate environment
- Strong working knowledge of accounting principles, bookkeeping, and financial operations
- Hands-on experience with Indian compliance: TDS, GST, Income Tax
- Proficient in MS Excel (vlookups, pivots, reconciliation templates, etc.)
- Strong communication and interpersonal skills
- Detail-oriented and able to work under pressure
- Comfortable with extended work hours during month-end or audit cycles
Preferred:
- Prior experience working with Zoho Books or similar accounting software
- Exposure to startups, services, or consulting environments
AuxoAI is hiring a GCP Data Engineer to design and build data pipelines and analytics platforms on Google Cloud.
Responsibilities:
- Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); a minimal streaming sketch follows this list
- Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
- Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
- Implement SQL-based transformations using Dataform (or dbt)
- Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
- Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
- Partner with solution architects and product teams to translate data requirements into technical designs
- Mentor junior data engineers and support knowledge-sharing across the team
- Contribute to documentation, code reviews, sprint planning, and agile ceremonies
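As an illustrative sketch of the streaming-pipeline work above (the subscription, table, and schema are placeholders, not project details): a minimal Apache Beam pipeline that reads events from Pub/Sub, parses them, and appends them to BigQuery; on GCP it would run on the DataflowRunner.

```python
# Minimal streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# Subscription path, table name, and schema are placeholders.
# Run with --runner=DataflowRunner on GCP.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "user_id": event["user"], "ts": event["ts"]}

def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING, user_id:STRING, ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```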
Requirements
- 5+ years of hands-on experience in data engineering, with at least 2 years on GCP
- Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
- Strong programming skills in Python and/or Java
- Experience with SQL optimization, data modeling, and pipeline orchestration (an orchestration sketch follows this list)
- Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
- Exposure to Dataform, dbt, or similar tools for ELT workflows
- Solid understanding of data architecture, schema design, and performance tuning
- Excellent problem-solving and collaboration skills
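A minimal orchestration sketch in the same spirit (project, dataset, and table names are invented for illustration): a Cloud Composer / Airflow DAG that rebuilds a date-partitioned, clustered BigQuery table once a day.

```python
# Illustrative Composer/Airflow DAG: one daily BigQuery transform writing to
# a partitioned, clustered table. Project/dataset/table names are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_orders`
PARTITION BY order_date
CLUSTER BY customer_id
AS
SELECT DATE(ts) AS order_date, customer_id, SUM(amount) AS total_amount
FROM `my-project.raw.orders`
GROUP BY order_date, customer_id
"""

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    build_daily_orders = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```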
Bonus Skills:
- GCP Professional Data Engineer certification
- Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
- Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
- Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
AuxoAI is looking for an experienced Data Architect to design and implement Dremio-based lakehouse platforms that serve analytics and AI workloads.
Key Responsibilities
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns (a query sketch follows this list).
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
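As a rough sketch only (the coordinator URL, credentials, and query are placeholders, and the endpoint paths, which follow Dremio's REST API v3, should be verified against the target deployment): submitting SQL to Dremio and polling for the result.

```python
# Rough sketch: submit SQL to a Dremio coordinator over its REST API (v3)
# and poll for results. Host, credentials, and query are placeholders; verify
# endpoint paths against your Dremio version's documentation.
import time
import requests

BASE = "https://dremio.example.com"  # placeholder coordinator URL

def login(user: str, password: str) -> dict:
    resp = requests.post(f"{BASE}/apiv2/login",
                         json={"userName": user, "password": password})
    resp.raise_for_status()
    return {"Authorization": f"_dremio{resp.json()['token']}"}

def run_query(sql: str, headers: dict) -> list:
    job = requests.post(f"{BASE}/api/v3/sql", json={"sql": sql}, headers=headers)
    job.raise_for_status()
    job_id = job.json()["id"]

    # Poll until the job completes, then fetch the result rows.
    while True:
        status = requests.get(f"{BASE}/api/v3/job/{job_id}", headers=headers).json()
        if status["jobState"] in ("COMPLETED", "FAILED", "CANCELED"):
            break
        time.sleep(2)
    if status["jobState"] != "COMPLETED":
        raise RuntimeError(f"Dremio job {job_id} ended in state {status['jobState']}")

    results = requests.get(f"{BASE}/api/v3/job/{job_id}/results", headers=headers)
    results.raise_for_status()
    return results.json()["rows"]
```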
Qualifications
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 10+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.