

Auxo AI
https://auxoai.com
AuxoAI is hiring a Finance Associate with hands-on experience in accounting, statutory compliance, and financial operations. The ideal candidate brings 1–3 years of experience from a CA firm or corporate setup, and is based in Bangalore or open to relocating.
While AuxoAI typically follows a hybrid work model (3 days onsite), this role may require working from the office 5 days a week during peak periods (month-end, audits, etc.).
Location: Bangalore
Responsibilities:
- Maintain accurate books of accounts: AP entries, GL reconciliation, BRS, and month-end closing activities
- Manage day-to-day accounting and bookkeeping operations
- Ensure timely compliance with Indian statutory requirements: TDS, GST, and Income Tax filings
- Assist with audit preparations, schedules, and reconciliation reports
- Coordinate with vendors, internal teams, and auditors
- Maintain accurate documentation for tax and financial records
- Support finance lead with MIS reporting and cost tracking
- Contribute to process improvement and automation initiatives
Requirements
- 1–3 years of relevant experience in a CA firm or corporate environment
- Strong working knowledge of accounting principles, bookkeeping, and financial operations
- Hands-on experience with Indian compliance: TDS, GST, Income Tax
- Proficient in MS Excel (VLOOKUPs, pivot tables, reconciliation templates, etc.)
- Strong communication and interpersonal skills
- Detail-oriented and able to work under pressure
- Comfortable with extended work hours during month-end or audit cycles
Preferred:
- Prior experience working with Zoho Books or similar accounting software
- Exposure to startups, services, or consulting environments
AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3–5 years of prior experience in data engineering, with a strong background in AWS (Amazon Web Services) technologies. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.
Experience: 3–5 years
Notice: Immediate to 15 days
Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL processes leveraging AWS services such as S3, Glue, EMR, Lambda, and Redshift.
Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
Implement data governance and security best practices to ensure compliance and data integrity.
Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
Qualifications :
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
3–5 years of prior experience in data engineering, with a focus on designing and building data pipelines.
Proficiency in AWS services, particularly S3, Glue, EMR, Lambda, and Redshift.
Strong programming skills in languages such as Python, Java, or Scala.
Experience with SQL and NoSQL databases, data warehousing concepts, and big data technologies.
Familiarity with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools (e.g., Apache Airflow) is a plus.
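For flavor only, here is a minimal sketch of the kind of glue code this stack implies: an S3-triggered Lambda handler that starts a Glue job run via boto3. The Glue job name, bucket layout, and argument key are hypothetical illustrations, not details from this posting.

```python
# Hypothetical sketch: an AWS Lambda handler that starts a Glue ETL job
# whenever a new object lands in an S3 ingest prefix. The job name and
# argument key below are illustrative placeholders.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Pass the new object's location to the Glue job as a job argument.
        response = glue.start_job_run(
            JobName="raw-to-redshift-etl",  # hypothetical job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue run {response['JobRunId']} for s3://{bucket}/{key}")
```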
AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.
Location: Bangalore, Hyderabad, Mumbai, and Gurgaon
Responsibilities:
- Design, build, and operate scalable on-premises or cloud data architecture.
- Analyze business requirements and translate them into technical specifications.
- Design, develop, and implement data engineering solutions using dbt on cloud platforms (Snowflake, Databricks).
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
- Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
- Implement data governance and security best practices to ensure compliance and data integrity.
- Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
- Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, with a focus on designing and building data pipelines.
- Experience working with dbt to implement end-to-end data engineering processes on Snowflake and Databricks.
- Comprehensive understanding of the Snowflake and Databricks ecosystems.
- Strong programming skills in SQL and Python or PySpark.
- Experience with data modeling, ETL processes, and data warehousing concepts.
- Familiarity with implementing CI/CD processes or other orchestration tools is a plus.
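As an illustration of the dbt-centric work described above: dbt-core 1.5+ exposes a programmatic runner, so a dbt build step can be driven from Python (for example, from an orchestration task). The selector tag and target name below are hypothetical placeholders.

```python
# Illustrative sketch: invoking dbt programmatically (dbt-core >= 1.5).
# The selector and target name are hypothetical placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to `dbt run --select tag:daily --target prod` on the CLI.
res: dbtRunnerResult = dbt.invoke(["run", "--select", "tag:daily", "--target", "prod"])

if not res.success:
    raise RuntimeError(f"dbt run failed: {res.exception}")

for r in res.result:  # one RunResult per executed model
    print(f"{r.node.name}: {r.status}")
```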
AuxoAI is seeking a skilled and experienced Power BI Specialist with 4–10 years of hands-on experience in developing and managing Business Intelligence (BI) solutions. As a Power BI Specialist, you will work closely with stakeholders to design, develop, and deploy interactive dashboards and reports, providing valuable insights and supporting data-driven decision-making. You should have a strong technical background in Power BI development and data modeling, and a deep understanding of business analytics.
Responsibilities:
Design, develop, and implement Power BI reports, dashboards, and data visualizations based on business requirements.
Create and maintain data models to ensure the accuracy and efficiency of reporting structures.
Extract, transform, and load (ETL) data from various sources (databases, flat files, web services, etc.) into Power BI.
Collaborate with business stakeholders to understand reporting requirements and deliver solutions that meet their needs.
Optimize Power BI reports and dashboards for better performance and usability.
Implement security measures such as row-level security (RLS) for reports and dashboards.
Develop and deploy advanced DAX (Data Analysis Expressions) formulas for complex data modeling and analysis.
Use SQL for data extraction and manipulation.
Apply strong analytical and problem-solving skills with keen attention to detail.
Publish and share reports through Power BI Service and Power BI Workspaces.
Qualifications:
Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
5+ years of experience in Power BI report development and dashboard creation.
Strong proficiency in Power BI Desktop and Power BI Service.
Experience in data modeling, including relationships, hierarchies, and DAX formulas.
Expertise in Power Query for data transformation and ETL processes.
Knowledge of business processes and ability to translate business needs into technical solutions.
Excellent communication and collaboration skills to work effectively with business teams and IT professionals.
Ability to manage multiple projects and meet deadlines in a fast-paced environment.
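By way of a hedged example of Power BI Service automation: a dataset refresh can be queued through the Power BI REST API. The workspace/dataset IDs and the Azure AD token acquisition below are assumptions, not details from this posting.

```python
# Hypothetical sketch: trigger a Power BI dataset refresh via the REST API.
# Assumes an Azure AD access token (e.g., via MSAL or a service principal)
# is already available; the IDs below are placeholders.
import requests

ACCESS_TOKEN = "<aad-access-token>"  # placeholder
WORKSPACE_ID = "<workspace-guid>"    # placeholder
DATASET_ID = "<dataset-guid>"        # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()  # HTTP 202 means the refresh was queued
print("Refresh queued:", resp.status_code)
```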
AuxoAI is looking for a highly motivated and detail-oriented Anaplan Modeler with foundational knowledge of Supply Chain Planning. You will work closely with Solution Architects and Senior Modelers to support the design, development, and optimization of Anaplan models for enterprise planning needs across industries.
This role offers a strong growth path toward Senior Modeler and Solution Architect roles and a chance to contribute to cutting-edge planning projects.
Responsibilities:
Build and maintain Anaplan models, modules, dashboards, and actions under defined specifications.
Execute data loading, validation, and basic transformation within Anaplan.
Optimize existing models for better performance and usability.
Apply Anaplan best practices and methodology in daily work.
Use Anaplan Connect or similar tools for data integration.
Assist in creating model documentation, user guides, and test plans.
Support testing and troubleshooting activities.
Develop a basic understanding of supply chain processes (demand, supply, inventory planning).
Collaborate effectively with internal teams and business stakeholders.
Participate in training and skill-building programs within the Anaplan ecosystem.
Requirements
Bachelor’s or Master’s in Engineering, Supply Chain, Operations Research, or a related field
3+ years of experience in a business/IT role with hands-on Anaplan modeling exposure
Anaplan Model Builder Certification is mandatory
Basic understanding of supply chain concepts like demand, supply, and inventory planning
Strong problem-solving and logical thinking skills
Proficiency in Excel or basic data analysis tools
Excellent attention to detail and communication skills
Enthusiasm to grow within the Anaplan and supply chain domain
AuxoAI is hiring an experienced Data Engineer to design, build, and operate data platforms on Google Cloud Platform (GCP).
Responsibilities:
- Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow)
- Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
- Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
- Implement SQL-based transformations using Dataform (or dbt)
- Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
- Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
- Partner with solution architects and product teams to translate data requirements into technical designs
- Mentor junior data engineers and support knowledge-sharing across the team
- Contribute to documentation, code reviews, sprint planning, and agile ceremonies
Requirements
- 5+ years of hands-on experience in data engineering, with at least 2 years on GCP
- Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
- Strong programming skills in Python and/or Java
- Experience with SQL optimization, data modeling, and pipeline orchestration
- Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
- Exposure to Dataform, dbt, or similar tools for ELT workflows
- Solid understanding of data architecture, schema design, and performance tuning
- Excellent problem-solving and collaboration skills
Bonus Skills:
- GCP Professional Data Engineer certification
- Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
- Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
- Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
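To make the Beam/Dataflow portion of this role concrete, here is a minimal hedged sketch of a streaming pipeline that reads JSON events from Pub/Sub and appends them to BigQuery. The subscription, table, and schema are hypothetical placeholders.

```python
# Hypothetical sketch: a streaming Apache Beam pipeline (runnable on
# Dataflow) reading JSON events from Pub/Sub and writing to BigQuery.
# Subscription, table, and schema below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub"
        )
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```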
AuxoAI is seeking an experienced Data Architect to design and scale Dremio-based lakehouse platforms.
Key Responsibilities
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Qualifications
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 10+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, dbt, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
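As a hedged illustration of the consumption patterns this role optimizes for, the sketch below queries Dremio over Arrow Flight from Python. The endpoint, credentials, and table name are assumptions; only the default Flight port (32010) is a known Dremio convention.

```python
# Hypothetical sketch: querying Dremio via Arrow Flight (pyarrow).
# Endpoint, credentials, and table are placeholders; requires pandas
# for read_pandas().
from pyarrow import flight

client = flight.FlightClient("grpc+tcp://dremio-coordinator:32010")

# The basic-auth handshake returns a bearer-token header to attach to calls.
token_pair = client.authenticate_basic_token("user", "password")
options = flight.FlightCallOptions(headers=[token_pair])

query = "SELECT * FROM lakehouse.sales.orders LIMIT 10"  # placeholder table
info = client.get_flight_info(flight.FlightDescriptor.for_command(query), options)
reader = client.do_get(info.endpoints[0].ticket, options)
df = reader.read_pandas()  # Arrow record batches -> pandas DataFrame
print(df.head())
```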