

Auxo AI
https://auxoai.com
Jobs at Auxo AI
Responsibilities:
Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); see the illustrative sketch after this list
Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
Implement SQL-based transformations using Dataform (or dbt)
Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
Partner with solution architects and product teams to translate data requirements into technical designs
Mentor junior data engineers and support knowledge-sharing across the team
Contribute to documentation, code reviews, sprint planning, and agile ceremonies
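To make the pipeline responsibilities above concrete, here is a minimal sketch (Python) of an Apache Beam streaming job of the kind this role builds: it reads JSON events from Pub/Sub and appends them to BigQuery via Dataflow. It is illustrative only; the project, topic, table, and schema names are hypothetical placeholders, and a production pipeline would add windowing, error handling, and dead-lettering.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True keeps the pipeline running continuously; pass
    # --runner=DataflowRunner plus project/region flags to execute on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # placeholder topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()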
Requirements
2+ years of hands-on experience in data engineering, with at least 2 years on GCP
Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
Strong programming skills in Python and/or Java
Experience with SQL optimization, data modeling, and pipeline orchestration (see the sketch after this list)
Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
Exposure to Dataform, dbt, or similar tools for ELT workflows
Solid understanding of data architecture, schema design, and performance tuning
Excellent problem-solving and collaboration skills
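As one concrete example of the SQL optimization and data modeling skills listed above, the sketch below (Python, google-cloud-bigquery) creates a date-partitioned, clustered BigQuery table so that date- and user-filtered queries prune partitions instead of scanning the full table. Project, dataset, and column names are hypothetical placeholders.

from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Partitioning by event_date and clustering by user_id keeps bytes scanned
# (and cost) proportional to the data a query actually touches.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.page_views` (
  event_date DATE,
  user_id STRING,
  url STRING
)
PARTITION BY event_date
CLUSTER BY user_id
"""
client.query(ddl).result()  # blocks until the DDL job completes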
Bonus Skills:
GCP Professional Data Engineer certification
Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
Key Responsibilities
Architect and lead the migration roadmap from Oracle EBS R12 to Fusion ERP on OCI.
Design scalable, secure, and compliant OCI environments (Compute, VCN, IAM, Storage, HA/DR).
Drive integration using OIC, SOA, REST/SOAP APIs, ODI, and FBDI (see the illustrative sketch after this list).
Partner with Finance and Operations teams to translate business needs into ERP solutions across GL, AP, AR, FA, CM, and P2P/O2C.
Define data migration, reporting, and analytics integration frameworks.
Lead vendors and internal teams through design, build, and deployment phases.
Enforce architecture standards, documentation, and governance practices.
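As a rough, hypothetical illustration of the integration work above, the Python sketch below posts a payload to a REST endpoint exposed by an OIC integration. The URL, payload fields, and credentials are placeholders only; the real contract is defined by the specific OIC flow or Oracle REST resource being invoked.

import requests

# Placeholder endpoint; the actual path and payload schema depend on the
# OIC integration (or Fusion REST resource) being called.
OIC_ENDPOINT = "https://oic.example.com/ic/api/integration/v1/flows/rest/INVOICE_IMPORT/1.0/invoices"

payload = {
    "invoiceNumber": "INV-1001",   # hypothetical fields
    "supplier": "Acme Components",
    "amount": 1250.00,
    "currency": "USD",
}

resp = requests.post(
    OIC_ENDPOINT,
    json=payload,
    auth=("integration.user", "app-password"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
print(resp.json())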
Qualifications
Bachelor’s degree in Computer Science, Information Systems, or related field.
10+ years in Oracle ERP architecture; 5+ years in Oracle Cloud (Fusion ERP + OCI).
Proven delivery of EBS to Fusion migration or coexistence programs.
Strong understanding of financial and procurement processes.
Hands-on expertise in OCI architecture, performance optimization, and cost governance.
Strong SQL and PL/SQL skills and integration experience (APIs, OIC, ODI).
Excellent communication and stakeholder management skills.
Preferred:
Oracle Cloud Architect and/or Fusion Financials certification.
Experience with multi-cloud or high-compliance (Hi-Tech / Manufacturing) environments.
Familiarity with DevOps and automation tools (Terraform, GitOps, CI/CD).
Key Responsibilities
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
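To show what fast, well-modeled access can look like from a consumer's side, here is a minimal sketch (Python) that queries a governed Dremio dataset over Arrow Flight, which Dremio exposes for high-throughput clients. The host, port, credentials, and dataset path are hypothetical placeholders; TLS and auth details depend on the deployment.

from pyarrow import flight

# Dremio conventionally serves Arrow Flight on port 32010; adjust for your deployment.
client = flight.FlightClient("grpc+tcp://dremio.example.com:32010")

# Basic auth returns an authorization header to attach to subsequent calls.
token = client.authenticate_basic_token("analyst", "secret")  # placeholder credentials
options = flight.FlightCallOptions(headers=[token])

query = "SELECT region, SUM(amount) AS total FROM curated.sales.orders GROUP BY region"
info = client.get_flight_info(flight.FlightDescriptor.for_command(query), options)
table = client.do_get(info.endpoints[0].ticket, options).read_all()
print(table.to_pandas())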
Qualifications
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 10+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, dbt, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Responsibilities:
- Design and develop user-friendly web interfaces using HTML, CSS, and JavaScript.
- Utilize modern frontend frameworks and libraries such as React, Angular, or Vue.js to build dynamic and responsive web applications.
- Develop and maintain server-side logic using programming languages such as Java, Python, Ruby, Node.js, or PHP.
- Build and manage APIs for seamless communication between the frontend and backend systems (see the sketch after this list).
- Integrate third-party services and APIs to enhance application functionality.
- Implement CI/CD pipelines to automate testing, integration, and deployment processes.
- Monitor and optimize the performance of web applications to ensure a high-quality user experience.
- Stay up-to-date with emerging technologies and industry trends to continuously improve development processes and application performance.
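As a small, hypothetical example of the API work referenced above, the sketch below uses FastAPI (one of the frameworks named in the qualifications) to expose a create/read resource with request validation via Pydantic. The resource model and in-memory store are for illustration only.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")  # hypothetical service name


class Order(BaseModel):
    id: int
    item: str
    quantity: int


_ORDERS: dict[int, Order] = {}  # in-memory store, for illustration only


@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    _ORDERS[order.id] = order
    return order


@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return _ORDERS[order_id]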
Qualifications:
- Bachelor's or Master's degree in Computer Science or a related subject, or hands-on experience demonstrating a working understanding of software applications.
- Knowledge of building applications that can be deployed in a cloud environment or are cloud-native.
- Strong expertise in building backend applications using Java/C#/Python, with demonstrable experience in frameworks such as Spring/Vert.x/.NET/FastAPI.
- Deep understanding of enterprise design patterns, API development and integration, and Test-Driven Development (TDD); see the test sketch after this list.
- Working knowledge of building applications that leverage databases such as PostgreSQL, MySQL, MongoDB, Neo4j, or storage technologies such as AWS S3 and Azure Blob Storage.
- Hands-on experience building enterprise applications that meet security and reliability requirements.
- Hands-on experience building applications using one of the major cloud providers (AWS, Azure, GCP).
- Working knowledge of CI/CD tools for application integration and deployment.
- Working knowledge of reliability and monitoring tools for tracking application performance.
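Tying into the TDD expectation above, here is a hypothetical companion test for the FastAPI sketch earlier in this posting, using FastAPI's TestClient; the module name orders_service is assumed.

from fastapi.testclient import TestClient

from orders_service import app  # assumes the FastAPI sketch above lives in orders_service.py

client = TestClient(app)


def test_create_then_fetch_order():
    created = client.post("/orders", json={"id": 1, "item": "widget", "quantity": 2})
    assert created.status_code == 201

    fetched = client.get("/orders/1")
    assert fetched.status_code == 200
    assert fetched.json()["item"] == "widget"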
Similar companies
About the company
Virtana’s AI-powered platform delivers the deepest hybrid infrastructure observability, empowering you to optimize performance, reduce costs, and mitigate risks.
Founded in 2008, Virtana has long been trusted by more than 150 Global 2000 enterprise customers, including AstraZeneca, Dell, Apple, Geico, Costco, Nasdaq, and Boeing. With advanced AI-powered capabilities, Virtana helps IT teams proactively address issues, streamline operations, and transform infrastructure into strategic assets in today's rapidly evolving digital landscape.
In 2023, Virtana Corp. was recognized:
- As one of the Top 20 Coolest Cloud Monitoring Companies (Cloud 100) by CRN.
- As a Best Place to Work in the Bay Area by Comparably.
- In eight Gartner® reports for innovation in Infrastructure Monitoring and Artificial Intelligence for IT Operations (AIOps).
💰 Funding details
In the most recent funding round, Virtana raised $73 million in Series C funding in January 2022, led by Atalaya Capital Management and Elm Park Capital Management, among others. Virtana has raised a total of $192.42 million so far!
Jobs: 6
About the company
We are Proximity - a global team of coders, designers, product managers, geeks and experts. We solve hard, long-term engineering problems and build cutting-edge tech products.
About us
Born in 2019, Proximity Works is a global, fully distributed tech firm headquartered in San Francisco - with hubs across Mumbai, Dubai, Toronto, Stockholm, and Bengaluru. We’re in the business of solving high-stakes engineering challenges with AI-powered solutions tailored for industries like sports, media & entertainment, fintech, and enterprise platforms. From real-time game analytics and ticketing workflows to creative content generation, we help build software that serves millions every day.
About the Founders
At the helm is Hardik Jagda, CEO - a technologist with startup DNA who brings clarity to complexity and a passion for building delightful experiences.
Milestones & Impact
- Trusted by some of the world’s biggest players - from major media & entertainment giants to one of the world’s largest cricket websites and the second-largest stock exchange in the world.
- Delivered game-changing tech: slashing content creation by 90%, doubling performance metrics for NASDAQ clients, and accelerating speed/performance wins for platforms like Dream11.
Culture & Why It Matters
- Fully distributed and flexible: work 100% remotely, design your own schedule, build habits that work for you and not the other way around.
- People-first culture: Community events, “Proxonaut battles,” monthly off-sites, and a liberal referral policy keep us connected even when we’re apart.
- High-trust environment: autonomy is encouraged. You’re empowered to act, learn fast, and iterate boldly. We know great work comes when talented people have space to think and create.
Jobs: 3
About the company
Deep-tech startup focusing on autonomy and intelligence for unmanned systems: guidance and navigation, AI/ML, computer vision, information fusion, LLMs, generative AI, and remote sensing.
Jobs: 3
About the company
Beyond Seek is a team of R.A.R.E individuals who are solving impactful problems using the best tools available today!
Jobs: 1
About the company
Sick of the endless waiting? Waiting on code reviews, QA feedback, or that "quick call"? At Middleware, we’re all about freeing up engineers like you to do what you love—build. We’ve created a cockpit that gives engineering leaders the insights they need to unblock teams, cut bottlenecks, and let engineers focus on impact.
Middleware is building a productivity OS for engineering teams. It provides visibility into engineering pipelines and offers actionable insights to improve software delivery processes. The platform is trusted by 500+ teams worldwide and is SOC 2 certified.
Why You'll Love Working with Us 💖
We’re engineers at heart. Middleware was founded by ex-Uber and Maersk engineers who know what it’s like to be stuck in meeting loops and endless waiting. If you're here to build, to make things happen, and to change the game for engineering teams everywhere, let’s chat!
Jobs: 1




