Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia

Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with 3+ years of hands-on experience in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
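The reflection and caching optimization called for above is typically driven through Dremio's SQL layer. As a hedged sketch (the DDL below follows the general shape of Dremio's documented aggregate-reflection syntax, but exact keywords vary by version, and the dataset, column, and reflection names are invented for illustration), a small helper can compose such a statement:

```python
# Illustrative sketch, not an official Dremio API: composing an
# aggregate-reflection DDL statement of the kind used to accelerate
# Dremio queries. All names below are hypothetical examples.
def aggregate_reflection_ddl(dataset: str, name: str,
                             dimensions: list[str],
                             measures: dict[str, str]) -> str:
    """Build an ALTER DATASET ... CREATE AGGREGATE REFLECTION statement."""
    dims = ", ".join(dimensions)
    meas = ", ".join(f"{col} ({fn})" for col, fn in measures.items())
    return (f"ALTER DATASET {dataset} CREATE AGGREGATE REFLECTION {name} "
            f"USING DIMENSIONS ({dims}) MEASURES ({meas})")

ddl = aggregate_reflection_ddl('"sales"."orders_curated"', "agg_by_region",
                               ["region", "order_month"],
                               {"revenue": "SUM", "order_id": "COUNT"})
print(ddl)
```

In practice the generated statement would be submitted through Dremio's SQL endpoint (JDBC, Arrow Flight, or the REST API), and the resulting reflection tuned against the actual query patterns it is meant to accelerate.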
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
We are looking for a skilled BI Engineer to join our team. The ideal candidate will have expertise in MicroStrategy, SQL, Reporting and Data Modeling along with experience in building insightful dashboards.
Key Responsibilities:
●Design, develop, and deploy MicroStrategy dashboards, reports, and visualizations.
●Write complex SQL queries to extract, manipulate, and analyse data from various sources.
●Develop and optimize data warehouse solutions for efficient data storage and retrieval.
●Collaborate with cross-functional teams to understand business requirements and deliver data solutions.
●Maintain and support MicroStrategy objects (attributes, facts, filters, prompts)
●Ensure data accuracy, consistency, and performance across reports and dashboards.
●Automate reporting processes and improve data visualization techniques.
●Troubleshoot data issues and optimize queries for better performance.
●Collaborate with business teams to understand reporting needs and translate them into visual insights.
●Work closely with business analysts, data engineers, and stakeholders to gather requirements and ensure solutions meet business needs.
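The SQL extraction and aggregation work described above can be illustrated with a self-contained sketch using the standard-library sqlite3 module as a stand-in for a real warehouse; the table and column names are invented for illustration:

```python
# Minimal sketch of dashboard-style SQL work: load rows, index the
# filter column, and run the aggregate a report would sit on top of.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EU", 120.0), ("EU", 80.0), ("US", 250.0)])
# An index on the grouping/filtering column is a common first tuning step.
conn.execute("CREATE INDEX idx_sales_region ON sales(region)")

rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)
```

The same pattern, aggregate query over an indexed fact table, carries over to warehouse engines such as Redshift, Snowflake, or BigQuery, where the tuning levers differ but the SQL shape is the same.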
Required Skills & Qualifications:
●5+ years of experience in BI engineering or data analytics.
●Strong expertise in MicroStrategy for dashboard, data modeling, and report development.
●Knowledge of MicroStrategy architecture, development, and administration tools.
●Strong understanding of data modelling concepts.
●Ability to perform data modeling and ETL validation to ensure the accuracy of reports.
●Proficiency in SQL for querying and data manipulation.
●Experience working with data warehousing concepts and technologies (Redshift, Snowflake, BigQuery, etc.).
●Ability to process large datasets and optimize queries for performance.
●Strong analytical and problem-solving skills.
Preferred Qualifications:
●MicroStrategy certification will be considered an added advantage.
●Familiarity with Tableau.
We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.
Qualifications
● Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent practical experience; or a degree in an analytical field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science)
● 3+ years of experience with data analysis and metrics development
● 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders
● 2+ years of experience writing SQL queries
● 2+ years of experience scripting in Python
● Demonstrated curiosity in and excitement for Web3/blockchain technologies
● Interested in learning new technologies to solve customer needs with lots of creative freedom
● Strong communication skills and business acumen
● Self-starter, motivated by an interest for developing the best possible solutions to problems
● Experience with Google Cloud (BigQuery), the Databricks stack, DBT, Tableau, and Jupyter is a plus
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), as well as automation and testing strategies, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities :
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements :
- Strong working knowledge (3+ years) of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization/tuning/resource allocations
- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL or analytical databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.
- Have a deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus
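The Spark optimization and tuning experience called for above usually begins with session-level configuration. The following is an illustrative PySpark config fragment (it assumes a PySpark environment; the values are common starting points, not prescriptions, and should be sized to the actual cluster):

```python
# Config-fragment sketch: tuning a SparkSession for an ETL workload.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("etl-pipeline")
    # Shuffle parallelism: size to cluster cores rather than the default 200.
    .config("spark.sql.shuffle.partitions", "128")
    # Executor sizing: balance memory per task against GC pressure.
    .config("spark.executor.memory", "4g")
    .config("spark.executor.cores", "4")
    # Adaptive query execution lets Spark re-plan joins and shuffles at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)
```

From there, tuning typically proceeds empirically: inspect the Spark UI for skewed stages and spill, then adjust partitioning and resource allocations accordingly.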
- Design and develop a framework, internal tools, and scripts for testing large-scale data systems, machine learning algorithms, and responsive User Interfaces.
- Create repeatability in testing through automation
- Participate in code reviews, design reviews, architecture discussions.
- Performance testing and benchmarking of Bidgely product suites
- Driving the adoption of these best practices around coding, design, quality, performance in your team.
- Lead the team on all technical aspects and own the quality of your teams’ deliverables
- Understand requirements, design exhaustive test scenarios, execute manual and automated test cases, dig deeper into issues, identify root causes, and articulate defects clearly.
- Strive for excellence in quality by looking beyond obvious scenarios and stated requirements and by keeping end-user needs in mind.
- Debug automation, product, deployment, and production issues and work with stakeholders/team on quick resolution
- Deliver a high-quality robust product in a fast-paced start-up environment.
- Collaborate with the engineering team and product management to elicit & understand their requirements and develop potential solutions.
- Stay current with the latest technology, tools, and methodologies; share knowledge by clearly articulating results and ideas to key decision-makers.
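The "repeatability through automation" point above is, at its simplest, codifying checks so they run identically every time. A minimal stdlib sketch (the function under test is invented for illustration):

```python
# Minimal repeatable-test sketch using the stdlib unittest framework.
import unittest

def normalize_reading(value, lo=0.0, hi=100.0):
    """Clamp a reading into [lo, hi] — a stand-in for real product logic."""
    return max(lo, min(hi, value))

class NormalizeReadingTests(unittest.TestCase):
    def test_in_range_passthrough(self):
        self.assertEqual(normalize_reading(42.0), 42.0)

    def test_clamps_out_of_range(self):
        self.assertEqual(normalize_reading(-5.0), 0.0)
        self.assertEqual(normalize_reading(250.0), 100.0)

# Build and run the suite explicitly so the result can be inspected.
suite = unittest.TestLoader().loadTestsFromTestCase(NormalizeReadingTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

In a CI pipeline the same suite would run on every commit, which is what turns one-off manual checks into repeatable automation.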
Requirements
- BS/MS in Computer Science, Electrical Engineering, or equivalent
- 6+ years of experience in designing automation frameworks, tools
- Strong object-oriented design skills, knowledge of design patterns, and an uncanny ability to design intuitive module and class-level interfaces
- Deep understanding of design patterns and optimizations
- Experience leading multi-engineer projects and mentoring junior engineers
- Good understanding of data structures and algorithms and their space and time complexities. Strong technical aptitude and a good knowledge of CS fundamentals
- Experience in non-functional testing and performance benchmarking
- Knowledge of Test-Driven Development & implementing CI/CD
- Strong hands-on and practical working experience with at least one programming language: Java/Python/C++
- Strong analytical, problem solving, and debugging skills.
- Strong experience in API automation using Jersey/Rest Assured.
- Fluency in automation tools and frameworks such as Selenium, TestNG, JMeter, JUnit, Jersey, etc.
- Exposure to distributed systems or web applications
- Strong in RDBMS or large-scale data systems such as Hadoop, Cassandra, etc.
- Hands-on experience with build tools like Maven/Gradle & Jenkins
- Experience in testing on various browsers and devices.
- Strong communication and collaboration skills.
Senior Data Scientist-Job Description
The Senior Data Scientist is a creative problem solver who uses statistical and mathematical principles and modelling skills to uncover new insights that significantly and meaningfully impact business decisions and actions. They apply their data science expertise to identifying, defining, and executing state-of-the-art techniques for academic opportunities and business objectives in collaboration with other Analytics team members. The Senior Data Scientist will execute analyses and outputs spanning test design and measurement, predictive analytics, multivariate analysis, data/text mining, pattern recognition, artificial intelligence, and machine learning.
Key Responsibilities:
- Perform the full range of data science activities, including test design and measurement, predictive/advanced analytics, data mining, and analytic dashboards.
- Extract, manipulate, analyse & interpret data from various corporate data sources developing advanced analytic solutions, deriving key observations, findings, insights, and formulating actionable recommendations.
- Generate clearly understood and intuitive data science / advanced analytics outputs.
- Provide thought leadership and recommendations on business process improvement, analytic solutions to complex problems.
- Participate in best practice sharing and communication platform for advancement of the data science discipline.
- Coach and collaborate with other data scientists and data analysts.
- Present impact, insights, outcomes & recommendations to key business partners and stakeholders.
- Comply with established Service Level Agreements to ensure timely, high quality deliverables with value-add recommendations, clearly articulated key findings and observations.
Qualification:
- Bachelor's Degree (B.A./B.S.) or Master’s Degree (M.A./M.S.) in Computer Science, Statistics, Mathematics, Machine Learning, Physics, or similar degree
- 5+ years of experience in data science in a digitally advanced industry focusing on strategic initiatives, marketing and/or operations.
- Advanced knowledge of best-in-class analytic software tools and languages: Python, SQL, R, SAS, Tableau, Excel, PowerPoint.
- Expertise in statistical methods, statistical analysis, data visualization, and data mining techniques.
- Experience in test design, Design of Experiments, A/B testing, and measurement science.
- Strong influencing skills to drive a robust testing agenda and data-driven decision making for process improvements.
- Strong critical thinking skills to track down complex data and engineering issues, evaluate different algorithmic approaches, and analyse data to solve problems.
- Experience in partnering with IT, marketing operations & business operations to deploy predictive analytic solutions.
- Ability to translate and communicate complex analytical/statistical/mathematical concepts to non-technical audiences.
- Strong written and verbal communications skills, as well as presentation skills.
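The A/B testing and Design of Experiments skills listed above rest on standard statistical machinery. As a worked sketch, a two-proportion z-test, one common way to read out a simple A/B conversion test, can be computed with the standard library alone (the counts below are invented for illustration):

```python
# Worked sketch of a two-proportion z-test for an A/B conversion readout.
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Control: 120 conversions of 2400; variant: 165 of 2400.
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(round(z, 2))  # |z| > 1.96 ⇒ significant at the 5% level (two-sided)
```

Real test programs layer power analysis, multiple-comparison corrections, and sequential-testing safeguards on top of this basic statistic.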
Role / Purpose - Lead Developer - API and Microservices
Must have a strong hands-on development track record building integration utilizing a variety of integration products, tools, protocols, technologies, and patterns.
- Must have an in-depth understanding of SOA/EAI/ESB concepts, SOA Governance, Event-Driven Architecture, message-based architectures, file sharing, and exchange platforms, data virtualization and caching strategies, J2EE design patterns, frameworks
- Should possess experience with at least one middleware technology (Application Servers, BPMS, BRMS, ESB & Message Brokers), programming languages (e.g. Java/J2EE, JavaScript, COBOL, C), operating systems (e.g. Windows, Linux, MVS), and databases (DB2, MySQL, NoSQL databases like MongoDB, Cassandra, Hadoop, etc.)
- Must have advanced, hands-on experience implementing API service architectures (SOAP, REST) using a market-leading API management tool such as Apigee, and frameworks such as Spring Boot for microservices
- Appetite to manage large-scale projects and multiple tracks
- Experience and knowhow of the e-commerce domain and retail experience are preferred
- Good communication and people management skills
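The REST API service architectures referenced above can be reduced to a very small working example. The sketch below uses only the Python standard library (the resource names and data are invented; a production microservice would use a proper framework such as Spring Boot, as the role describes):

```python
# Minimal REST-style JSON endpoint sketch: GET /orders/<id> returns a record.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORDERS = {"1001": {"id": "1001", "status": "shipped"}}  # illustrative data

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral port in a background thread, then call the endpoint.
server = HTTPServer(("127.0.0.1", 0), OrderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/orders/1001") as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload["status"])
```

The same resource-oriented contract (URL identifies the resource, HTTP verb identifies the action, JSON carries the representation) is what API management tools like Apigee then secure, throttle, and monitor.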
Job Description:
- Bookkeeping and accounting in Tally ERP, Xero, QuickBooks, and applicable accounting software
- Responsible for preparation and management of books of accounts, records, and documents for foreign entities
- Preparation and reporting of Monthly/periodical MIS.
- Managing billing, receivables, and collection.
- Liaising with foreign consultants with respect to Bookkeeping, compliances
- Ensure compliance under various laws for payroll and non-payroll compliances.
- Managing Audits of the offshore entities under different statutes (GST/Sales Tax, Companies House)
- Managing payroll and payroll compliances
- Managing Banking operations and payments and operational fund flow/cash flow.
Desired Candidate Profile:
- Must have good communication skills to deal with foreign clients.
- Should have good knowledge of MS Office and Tally.
- Experience in corporate reporting, MIS, Power BI, Tableau, etc.
- Involvement in the overall application lifecycle
- Design and develop software applications in Scala and Spark
- Understand business requirements and convert them to technical solutions
- Rest API design, implementation, and integration
- Collaborate with Frontend developers and provide mentorship for Junior engineers in the team
- An interest and preferably working experience in agile development methodologies
- A team player, eager to invest in personal and team growth
- Staying up to date with cutting edge technologies and best practices
- Advocate for improvements to product quality, security, and performance
Desired Skills and Experience
- 1.5+ years of development experience in Scala / Java
- Strong understanding of the development cycle, programming techniques, and tools.
- Strong problem solving and verbal and written communication skills.
- Experience in working with web development using J2EE or similar frameworks
- Experience in developing REST APIs
- BE in Computer Science
- Experience with Akka or microservices is a plus
- Experience with Big Data technologies like Spark/Hadoop is a plus

The company offers very competitive compensation packages commensurate with your experience. We offer full benefits, continual career & compensation growth, and many other perks.