Ab>initio, Big Data, Informatica, Tableau, Data Architect, Cognos, Microstrategy, Healthcare Business Analysts, Cloud etc.
Posted by Dhaval Upadhyay
1 - 15 yrs
₹5L - ₹10L / yr
Pune, Chicago, Hyderabad, New York
Skills
Ab Initio
Cognos
MicroStrategy
Business Analysis
Hadoop
Informatica PowerCenter
Tableau
Exusia, Inc. (ex-OO-see-ah: translated from Greek to mean "Immensely Powerful and Agile") was founded with the objective of addressing a growing gap in the data innovation and engineering space as the next global leader in big data, analytics, data integration and cloud computing solutions. Exusia is a multinational, delivery-centric firm that provides consulting and software as a service (SaaS) solutions to leading financial, government, healthcare, telecommunications and high technology organizations facing the largest data volumes and the most complex information management requirements.

Exusia was founded in the United States in 2012 with headquarters in New York City and regional US offices in Chicago, Atlanta and Los Angeles. Exusia's international presence continues to expand and is driven from Toronto (Canada), Sao Paulo (Brazil), Johannesburg (South Africa) and Pune (India).

Our mission is to empower clients to grow revenue, optimize costs and satisfy regulatory requirements through the innovative use of information and analytics. We leverage a unique blend of strategy, intellectual property, technical execution and outsourcing to enable our clients to achieve significant returns on investment for their business, data and technology initiatives.

At the core of our philosophy is a quality-first, trust-building, delivery-focused client relationship. The foundation of this relationship is the talent of our team. By recruiting and retaining the best talent in the industry, we are able to deliver a broad range of customized, cutting-edge solutions to clients whose data volumes and requirements number among the largest in the world.

About Exusia

Founded: 2012
Type: Services
Size: 100-1000
Stage: Profitable

About

Exusia is a multinational firm that provides consulting and software-as-a-service solutions to leading organizations in the healthcare, finance, telecommunications, consumer products, hospitality, supply chain, and high technology industries. It addresses the growing gap in the strategy and data engineering space as an emerging global leader in analytics, data engineering, and cloud computing solutions. Exusia is ISO 27001 certified and offers managed services to organizations facing the largest data volumes and the most complex data engineering requirements. Founded in 2012 in New York City, the company has its Americas headquarters in Miami, European headquarters in London, Africa headquarters in Johannesburg, and Asia headquarters in Pune, with delivery centers in Pune, Gurugram, Chennai, Hyderabad, and Bangalore.

Connect with the team

Dhaval Upadhyay


Similar jobs

AI Industry
Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio
Data engineering
Business Intelligence (BI)
Tableau
PowerBI
+51 more

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, including at least 3 years hands-on with Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
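The curation and semantic-modeling responsibilities above follow a common lakehouse pattern: raw data lands as-is, and governed SQL views expose a cleaned, business-friendly shape to BI consumers. A minimal runnable sketch of that layering, using Python's built-in sqlite3 purely as a stand-in for the SQL engine (Dremio would express the same idea as views over Parquet/Iceberg sources in object storage; all table and column names here are invented for illustration):

```python
import sqlite3

# sqlite3 stands in for the lakehouse SQL engine so the pattern is runnable
# anywhere; in Dremio the "raw" layer would be files in S3/ADLS.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE raw_orders (
        order_id TEXT, amount_cents INTEGER, country TEXT, status TEXT
    )
""")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [("o1", 1250, "us", "SHIPPED"),
     ("o2", 900,  "US", "cancelled"),
     ("o3", 4300, "in", "shipped")],
)

# Curated semantic layer: normalize casing and units behind a view.
conn.execute("""
    CREATE VIEW curated_orders AS
    SELECT order_id,
           amount_cents / 100.0 AS amount_usd,
           UPPER(country)       AS country,
           LOWER(status)        AS status
    FROM raw_orders
""")

# BI tools query only the governed view, never the raw table.
shipped_revenue = conn.execute(
    "SELECT SUM(amount_usd) FROM curated_orders WHERE status = 'shipped'"
).fetchone()[0]
print(shipped_revenue)  # 12.50 + 43.00 = 55.5
```

In Dremio the same separation is what reflections accelerate and RBAC policies attach to: access is granted on the curated view, not the raw source.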


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Skellam AI
Sherly P
Posted by Sherly P
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹24L / yr
MicroStrategy
Tableau

We are looking for a skilled BI Engineer to join our team. The ideal candidate will have expertise in MicroStrategy, SQL, Reporting and Data Modeling along with experience in building insightful dashboards.

Key Responsibilities:

  • Design, develop, and deploy MicroStrategy dashboards, reports, and visualizations.
  • Write complex SQL queries to extract, manipulate, and analyse data from various sources.
  • Develop and optimize data warehouse solutions for efficient data storage and retrieval.
  • Collaborate with cross-functional teams to understand business requirements and deliver data solutions.
  • Maintain and support MicroStrategy objects (attributes, facts, filters, prompts).
  • Ensure data accuracy, consistency, and performance across reports and dashboards.
  • Automate reporting processes and improve data visualization techniques.
  • Troubleshoot data issues and optimize queries for better performance.
  • Collaborate with business teams to understand reporting needs and translate them into visual insights.
  • Work closely with business analysts, data engineers, and stakeholders to gather requirements and ensure solutions meet business needs.
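The query-optimization duty above can be made concrete with a small runnable sketch. sqlite3 (Python stdlib) stands in for the warehouse here; the discipline is the same on Redshift/Snowflake/BigQuery, though their plan syntax differs: inspect the plan, add an index on the filter column, and confirm the plan switches from a full scan to an index search. Table and index names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10.0), ("west", 20.0), ("east", 5.0)] * 100)

query = "SELECT SUM(amount) FROM sales WHERE region = 'east'"

# Before indexing: the engine must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_sales_region ON sales(region)")

# After indexing: the plan uses the index to satisfy the filter.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

total = conn.execute(query).fetchone()[0]
print(plan_before, plan_after, total)
```

The same before/after plan comparison is how a report that suddenly slows down is usually diagnosed: the data grew past the point where a scan was cheap.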

Required Skills & Qualifications:

  • 5+ years of experience in BI engineering or data analytics.
  • Strong expertise in MicroStrategy for dashboard, data modeling, and report development.
  • Knowledge of MicroStrategy architecture, development, and administration tools.
  • Strong understanding of data modeling.
  • Ability to perform data modeling and ETL validation to ensure accuracy of reports.
  • Proficiency in SQL for querying and data manipulation.
  • Experience working with data warehousing concepts and technologies (Redshift, Snowflake, BigQuery, etc.).
  • Ability to process large datasets and optimize queries for performance.
  • Strong analytical and problem-solving skills.

Preferred Qualifications:

  • MicroStrategy certification will be considered an added advantage.
  • Familiarity with Tableau.

KGISL MICROCOLLEGE
Agency job
via EDU TECH by Srimathi Balamurugan
Remote, Kochi (Cochin)
1 - 5 yrs
₹2L - ₹6L / yr
Business Analysis
SQL
MS-Excel
Tableau
PowerBI

We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.

Institutional-grade tools to understand digital assets
Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore), Coimbatore
3 - 7 yrs
₹20L - ₹30L / yr
Business Analysis
skill iconPython
SQL
Web3js
Tableau
+1 more

Qualifications

  • Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent practical experience, or a degree in an analytical field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science).
  • 3+ years of experience with data analysis and metrics development.
  • 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders.
  • 2+ years of experience writing SQL queries.
  • 2+ years of experience scripting in Python.
  • Demonstrated curiosity in and excitement for Web3/blockchain technologies.
  • Interest in learning new technologies to solve customer needs, with lots of creative freedom.
  • Strong communication skills and business acumen.
  • Self-starter, motivated by an interest in developing the best possible solutions to problems.
  • Experience with Google Cloud (BigQuery), the Databricks stack, DBT, Tableau, and Jupyter is a plus.

Hyderabad
3 - 6 yrs
₹10L - ₹16L / yr
SQL
Spark
Analytical Skills
Hadoop
Communication Skills
+4 more

The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business-unit or project perspective.


Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.


Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.


Actively participates with other consultants in problem-solving and approach development.


Responsibilities :


Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.


Perform data analysis to validate data models and to confirm the ability to meet business needs.


Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.


Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.


Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.


Coordinate with Data Architects, Program Managers and participate in recurring meetings.


Help and mentor team members to understand the data model and subject areas.


Ensure that the team adheres to best practices and guidelines.


Requirements :


- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.


- Experience with Spark optimization/tuning/resource allocations


- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.


- Experience with relational databases (e.g., PostgreSQL, MySQL) and cloud warehouse or NoSQL stores (e.g., Redshift, BigQuery, Cassandra).


- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.


- Have a deep understanding of the various stacks and components of the Big Data ecosystem.


- Hands-on experience with Python is a huge plus
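The source-to-target transformation mapping this role owns can be prototyped as a small, testable function long before it becomes a Spark job. A plain-Python sketch of one mapping rule set (column names and rules are invented for illustration; in production the same logic would run inside a PySpark or ETL transform):

```python
def map_source_to_target(row: dict) -> dict:
    """One source-to-target mapping: rename, cast, derive, and validate."""
    if row.get("cust_id") in (None, ""):
        raise ValueError("cust_id is mandatory in the source extract")
    return {
        "customer_id": str(row["cust_id"]).strip(),            # rename + trim
        "revenue_usd": round(int(row["rev_cents"]) / 100, 2),  # unit cast
        "is_active": row["status"].upper() == "ACTIVE",        # derived flag
    }

source_rows = [
    {"cust_id": " 42 ", "rev_cents": "1999", "status": "active"},
    {"cust_id": "7",    "rev_cents": "0",    "status": "CLOSED"},
]
target_rows = [map_source_to_target(r) for r in source_rows]
print(target_rows)
```

Keeping each mapping rule in one pure function makes the "ETL validation" requirement above cheap: the rules can be unit-tested row by row before any cluster time is spent.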

Bidgely
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
6 - 10 yrs
₹20L - ₹40L / yr
SDET
Automation
Test Automation (QA)
Selenium
TestNG
+18 more
Responsibilities

  • Design and develop a framework, internal tools, and scripts for testing large-scale data systems, machine learning algorithms, and responsive User Interfaces.
  • Create repeatability in testing through automation
  • Participate in code reviews, design reviews, architecture discussions.
  • Performance testing and benchmarking of Bidgely product suites
  • Driving the adoption of these best practices around coding, design, quality, performance in your team.
  • Lead the team on all technical aspects and own the quality of your teams’ deliverables
  • Understand requirements, design exhaustive test scenarios, execute manual and automated test cases, dig deeper into issues, identify root causes, and articulate defects clearly.
  • Strive for excellence in quality by looking beyond obvious scenarios and stated requirements and by keeping end-user needs in mind.
  • Debug automation, product, deployment, and production issues and work with stakeholders/team on quick resolution
  • Deliver a high-quality robust product in a fast-paced start-up environment.
  • Collaborate with the engineering team and product management to elicit & understand their requirements and develop potential solutions.
  • Stay current with the latest technology, tools, and methodologies; share knowledge by clearly articulating results and ideas to key decision-makers.

Requirements

  • BS/MS in Computer Science, Electrical Engineering, or equivalent
  • 6+ years of experience in designing automation frameworks, tools
  • Strong object-oriented design skills, knowledge of design patterns, and an uncanny ability to design intuitive module and class-level interfaces
  • Deep understanding of design patterns, optimizations
  • Experience leading multi-engineer projects and mentoring junior engineers
  • Good understanding of data structures and algorithms and their space and time complexities. Strong technical aptitude and a good knowledge of CS fundamentals
  • Experience in non-functional testing and performance benchmarking
  • Knowledge of Test-Driven Development and implementing CI/CD
  • Strong hands-on and practical working experience with at least one programming language: Java/Python/C++
  • Strong analytical, problem solving, and debugging skills.
  • Strong experience in API automation using Jersey/Rest Assured.
  • Fluency in automation tools and frameworks such as Selenium, TestNG, JMeter, JUnit, Jersey, etc.
  • Exposure to distributed systems or web applications
  • Good in RDBMS or any of the large data systems such as Hadoop, Cassandra, etc.
  • Hands-on experience with build tools like Maven/Gradle &  Jenkins
  • Experience in testing on various browsers and devices.
  • Strong communication and collaboration skills.
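A recurring building block in the automation frameworks described above is a polling wait: instead of sleeping a fixed time (a common source of flaky tests), re-check a condition until it holds or a timeout expires. Selenium's explicit waits work the same way. A stdlib-only sketch of such a helper:

```python
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    """Poll `condition` until it returns truthy or `timeout` seconds pass.

    Returns True on success, False on timeout; callers assert on the
    result, which keeps timing flakiness out of individual test cases.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Simulated asynchronous system: "ready" flips after a short delay.
ready_at = time.monotonic() + 0.2
ok = wait_until(lambda: time.monotonic() >= ready_at, timeout=2.0)
print(ok)  # True: the condition became true well before the timeout
```

The same helper pattern applies whether the condition is a UI element appearing, an API returning 200, or a Kafka message landing in a topic.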
Kaplan
Akshata Ranka
Posted by Akshata Ranka
Bengaluru (Bangalore)
7 - 10 yrs
₹15L - ₹20L / yr
Statistical Analysis
Data mining
Data Visualization
skill iconData Science
skill iconR Programming
+7 more

Senior Data Scientist-Job Description

The Senior Data Scientist is a creative problem solver who utilizes statistical/mathematical principles and modeling skills to uncover new insights that will significantly and meaningfully impact business decisions and actions. She/he applies data science expertise to identify, define, and execute state-of-the-art techniques for academic opportunities and business objectives in collaboration with other Analytics team members. The Senior Data Scientist will execute analyses and outputs spanning test design and measurement, predictive analytics, multivariate analysis, data/text mining, pattern recognition, artificial intelligence, and machine learning.

 

Key Responsibilities:

  • Perform the full range of data science activities, including test design and measurement, predictive/advanced analytics, data mining, and analytic dashboards.
  • Extract, manipulate, analyse & interpret data from various corporate data sources developing advanced analytic solutions, deriving key observations, findings, insights, and formulating actionable recommendations.
  • Generate clearly understood and intuitive data science / advanced analytics outputs.
  • Provide thought leadership and recommendations on business process improvement, analytic solutions to complex problems.
  • Participate in best practice sharing and communication platform for advancement of the data science discipline.
  • Coach and collaborate with other data scientists and data analysts.
  • Present impact, insights, outcomes & recommendations to key business partners and stakeholders.
  • Comply with established Service Level Agreements to ensure timely, high quality deliverables with value-add recommendations, clearly articulated key findings and observations.

Qualification:

  • Bachelor's Degree (B.A./B.S.) or Master’s Degree (M.A./M.S.) in Computer Science, Statistics, Mathematics, Machine Learning, Physics, or similar degree
  • 5+ years of experience in data science in a digitally advanced industry focusing on strategic initiatives, marketing and/or operations.
  • Advanced knowledge of best-in-class analytic software tools and languages: Python, SQL, R, SAS, Tableau, Excel, PowerPoint.
  • Expertise in statistical methods, statistical analysis, data visualization, and data mining techniques.
  • Experience in test design, Design of Experiments, A/B testing, and measurement science; strong influencing skills to drive a robust testing agenda and data-driven decision making for process improvements.
  • Strong Critical thinking skills to track down complex data and engineering issues, evaluate different algorithmic approaches, and analyse data to solve problems.
  • Experience in partnering with IT, marketing operations & business operations to deploy predictive analytic solutions.
  • Ability to translate/communicate complex analytical/statistical/mathematical concepts with non-technical audience.
  • Strong written and verbal communications skills, as well as presentation skills.
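The A/B-testing and measurement-science requirements above reduce, in the simplest case, to a two-proportion z-test: pool the two conversion rates, estimate the standard error, and compare the z statistic to a critical value. A stdlib sketch with made-up conversion counts:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)         # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Control: 120/1000 converted; variant: 150/1000 converted.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # ~1.96, right at the two-sided 5% critical value
```

Real test programs layer power analysis, multiple-comparison corrections, and sequential-testing guards on top, but this statistic is the core of the read-out.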

 

Tata Digital Pvt Ltd
Agency job
via Seven N Half by Priya Singh
Mumbai, Mangalore, Gurugram
5 - 11 yrs
₹1L - ₹15L / yr
SOA
EAI
ESB
J2EE
RESTful APIs
+14 more

Role / Purpose - Lead Developer - API and Microservices

Must have a strong hands-on development track record building integration utilizing a variety of integration products, tools, protocols, technologies, and patterns.

  • Must have an in-depth understanding of SOA/EAI/ESB concepts, SOA Governance, Event-Driven Architecture, message-based architectures, file sharing, and exchange platforms, data virtualization and caching strategies, J2EE design patterns, frameworks
  • Should possess experience with at least one of middleware technologies (Application Servers, BPMS, BRMS, ESB & Message Brokers), Programming languages (e.g. Java/J2EE, JavaScript, COBOL, C), Operating Systems (e.g. Windows, Linux, MVS), and Databases (DB2, MySQL, No SQL Databases like MongoDB, Cassandra, Hadoop, etc.)
  • Must have advanced skills implementing API service architectures (SOAP, REST) using any of the market-leading API management tools such as Apigee and frameworks such as Spring Boot for microservices
  • Appetite to manage large-scale projects and multiple tracks
  • Experience and know-how in the e-commerce and retail domains are preferred
  • Good communication and people-management skills
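The event-driven and message-based architecture experience listed above centers on a stable message envelope: a versioned, serializable payload that producers and consumers agree on regardless of broker. A hedged Python sketch of such an envelope (the field names are illustrative, not any particular production schema):

```python
import json
import uuid
from dataclasses import dataclass, asdict, field

@dataclass
class EventEnvelope:
    """Versioned envelope carried on the bus; payload is domain-specific."""
    event_type: str
    payload: dict
    schema_version: int = 1
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, raw: str) -> "EventEnvelope":
        return cls(**json.loads(raw))

# Producer serializes; consumer deserializes and dispatches on event_type.
sent = EventEnvelope("order.created", {"order_id": "o-123", "amount": 49.9})
received = EventEnvelope.from_json(sent.to_json())
print(received.event_type, received.payload["order_id"])
```

Carrying `schema_version` and a unique `event_id` in every message is what makes consumer-side evolution and idempotent processing tractable as the platform grows.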
JetSynthesys Pvt. Ltd.
Agency job
via Jobdost by Mamatha A
Pune
5 - 10 yrs
₹7L - ₹13L / yr
Accounting
Finance
PowerBI
Tableau
Management Information System (MIS)
+4 more

Job Description:

  • Bookkeeping and accounting in Tally ERP, Xero, QuickBooks, and applicable accounting software
  • Responsible for preparation and management of books of accounts, records, and documents for foreign entities
  • Preparation and reporting of Monthly/periodical MIS.
  • Managing billing, receivables, and collection.
  • Liaising with foreign consultants with respect to Bookkeeping, compliances
  • Ensure compliance under various laws for payroll and non-payroll compliances.
  • Managing Audits of the offshore entities under different statutes (GST/Sales Tax, Companies House)
  • Managing payroll and payroll compliances
  • Managing Banking operations and payments and operational fund flow/cash flow. 
Desired Candidate Profile:

  • Must have good communication skills to deal with foreign clients.
  • Should have good knowledge of MS Office and Tally.
  • Experience in corporate reporting, MIS, Power BI, Tableau, etc.
They platform powered by machine learning. (TE1)
Agency job
via Multi Recruit by Paramesh P
Bengaluru (Bangalore)
1.5 - 4 yrs
₹8L - ₹16L / yr
skill iconScala
skill iconJava
Spark
Hadoop
Rest API
+1 more
  • Involvement in the overall application lifecycle
  • Design and develop software applications in Scala and Spark
  • Understand business requirements and convert them to technical solutions
  • Rest API design, implementation, and integration
  • Collaborate with Frontend developers and provide mentorship for Junior engineers in the team
  • An interest and preferably working experience in agile development methodologies
  • A team player, eager to invest in personal and team growth
  • Staying up to date with cutting edge technologies and best practices
  • Advocate for improvements to product quality, security, and performance
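The REST API design and implementation responsibility above is, at its core, a mapping from verb-plus-resource-path to a handler. A framework-free Python sketch of that routing idea (a real service here would use Akka HTTP in Scala or a web framework; the route and handler names are invented):

```python
# Route table: (HTTP method, path template) -> handler function.
ROUTES = {}

def route(method, template):
    """Decorator registering a handler for a method/path-template pair."""
    def register(handler):
        ROUTES[(method, template)] = handler
        return handler
    return register

@route("GET", "/users/{id}")
def get_user(params):
    return {"status": 200, "body": {"id": params["id"], "name": "demo"}}

def _match(template, path):
    """Return captured params if `path` matches `template`, else None."""
    t_parts = template.strip("/").split("/")
    p_parts = path.strip("/").split("/")
    if len(t_parts) != len(p_parts):
        return None
    params = {}
    for t, p in zip(t_parts, p_parts):
        if t.startswith("{") and t.endswith("}"):
            params[t[1:-1]] = p          # capture path parameter
        elif t != p:
            return None
    return params

def dispatch(method, path):
    for (m, template), handler in ROUTES.items():
        if m == method:
            params = _match(template, path)
            if params is not None:
                return handler(params)
    return {"status": 404, "body": None}

resp = dispatch("GET", "/users/42")
print(resp)  # {'status': 200, 'body': {'id': '42', 'name': 'demo'}}
```

Designing the route table first, independent of any framework, keeps resource naming and status-code behaviour reviewable before implementation begins.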

 

Desired Skills and Experience

  • At least 1.5 years of development experience in Scala or Java
  • Strong understanding of the development cycle, programming techniques, and tools.
  • Strong problem solving and verbal and written communication skills.
  • Experience in working with web development using J2EE or similar frameworks
  • Experience in developing REST APIs
  • BE in Computer Science
  • Experience with Akka or Micro services is a plus
  • Experience with big data technologies like Spark/Hadoop is a plus

The company offers very competitive compensation packages commensurate with your experience. We offer full benefits, continual career & compensation growth, and many other perks.

 
