Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia


Only candidates currently located in Bihar, or open to relocating to Bihar, should apply.
Job Description:
This is an exciting opportunity for an experienced industry professional with strong analytical and technical skills to join and add value to a dedicated and friendly team. We are looking for a Data Analyst who is driven by data-driven decision-making and insights. As a core member of the Analytics Team, the candidate will take ownership of data analysis projects by working independently with little supervision.
The ideal candidate is a highly resourceful and innovative professional with extensive experience in data analysis, statistical modeling, and data visualization. The candidate must have a strong command of data analysis tools like SAS/SPSS, Power BI/Tableau, or R, along with expertise in MS Excel and MS PowerPoint. The role requires optimizing data collection procedures, generating reports, and applying statistical techniques for hypothesis testing and data interpretation.
Key Responsibilities:
• Perform data analysis using tools like SAS, SPSS, Power BI, Tableau, or R.
• Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis.
• Utilize statistical techniques for hypothesis testing to validate data and interpretations (see the sketch after this list).
• Apply data mining techniques and OLAP methodologies for in-depth insights.
• Develop dashboards and data visualizations to present findings effectively.
• Collaborate with cross-functional teams to define, design, and execute data-driven strategies.
• Ensure the accuracy and integrity of data used for analysis and reporting.
• Utilize advanced Excel skills to manipulate and analyze large datasets.
• Prepare technical documentation and presentations for stakeholders.
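
As a concrete illustration of the hypothesis-testing responsibility above, here is a minimal Python sketch; it is illustrative only, and the metric names and figures are invented. It runs a Welch two-sample t-test comparing weekly sales across two quarters:

# Minimal, hypothetical example of hypothesis testing for data validation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Invented weekly sales figures for two quarters (13 weeks each).
q1_sales = rng.normal(loc=100.0, scale=15.0, size=13)
q2_sales = rng.normal(loc=108.0, scale=15.0, size=13)

# Welch two-sample t-test; H0: mean weekly sales are equal across quarters.
t_stat, p_value = stats.ttest_ind(q1_sales, q2_sales, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the difference between quarters is statistically significant.")
else:
    print("Fail to reject H0: no significant difference detected.")

The same pattern extends to other validation checks (chi-square tests, ANOVA) before figures are published in weekly or monthly reports.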
Candidate Profile:
Required Qualifications:
• Qualification: Graduate or Postgraduate in Statistics, MCA, or BE/B.Tech in Computer Science & Engineering, Information Technology, or Electronics.
• A minimum of 2 years' experience in data analysis using SAS/SPSS, Power BI/Tableau, or R.
• Proficiency in MS Office with expertise in MS Excel & MS PowerPoint.
• Strong analytical skills with attention to detail.
• Experience in data mining and OLAP methodologies.
• Ability to generate insights and reports based on data trends.
• Excellent communication and presentation skills.
Desired Qualifications:
• Experience in predictive analytics and machine learning techniques.
• Knowledge of SQL and database management.
• Familiarity with Python for data analysis.
• Experience in automating reporting processes.
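
On the report-automation point, a small pandas sketch along these lines is typical; the file and column names used here are hypothetical, not taken from the posting:

# Hypothetical recurring report: aggregate a raw extract and write a workbook.
import pandas as pd

raw = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

monthly = (
    raw.assign(month=raw["order_date"].dt.to_period("M").astype(str))
       .groupby(["month", "region"], as_index=False)["revenue"]
       .sum()
)

# Writing .xlsx requires an Excel engine such as openpyxl to be installed.
monthly.to_excel("monthly_revenue_report.xlsx", index=False)
print(f"Wrote {len(monthly)} summary rows.")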


We are seeking a passionate and knowledgeable Data Science and Data Analyst Trainer to deliver engaging and industry-relevant training programs. The trainer will be responsible for teaching core concepts in data analytics, machine learning, data visualization, and related tools and technologies. The ideal candidate will have 2-5 years of hands-on experience in the data domain and a flair for teaching and mentoring students or working professionals.

We are looking for a Business Intelligence (BI)/Data Analyst to create and manage Power BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis. If you are self-directed, passionate about data, and have business acumen and problem-solving aptitude, we'd like to meet you. Ultimately, you will enhance our business intelligence system to help us make better decisions.
Requirements and Qualifications
- BSc/BA in Computer Science, Engineering, or relevant field.
- Financial experience and a marketing background are a plus
- Strong Power BI development skills, including migration of existing deliverables to Power BI.
- Ability to work autonomously
- Data modelling, calculations, conversions, and scheduling data refreshes in Power BI.
- Proven experience as a Power BI Developer is a must.
- Industry experience is preferred. Familiarity with other BI tools (Tableau, QlikView).
- Analytical mind with a problem-solving aptitude.
Responsibilities
- Design, develop and maintain business intelligence solutions
- Craft and execute queries upon request for data (see the sketch after this list)
- Present information through reports and visualization based on requirements gathered from stakeholders
- Interact with the team to gain an understanding of the business environment, technical context, and organizational strategic direction
- Design, build and deploy new, and extend existing dashboards and reports that synthesize distributed data sources
- Ensure data accuracy, performance, usability, and functionality requirements of BI platform
- Manage data through MS Excel, Google Sheets, and SQL applications as required, and support other analytics platforms
- Develop and execute database queries and conduct analyses
- Develop and update technical documentation requirements
- Communicate insights to both technical and non-technical audiences.
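
As a sketch of the query-crafting duty above, the snippet below runs an ad-hoc request against an in-memory SQLite database; the table and column names are invented, and in practice the query would run against the organisation's own SQL sources:

# Hypothetical ad-hoc stakeholder request: revenue by region for January 2024.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 'North', 120.0, '2024-01-05'),
        (2, 'South',  80.5, '2024-01-09'),
        (3, 'North', 200.0, '2024-02-11');
""")

query = """
    SELECT region, ROUND(SUM(amount), 2) AS revenue
    FROM orders
    WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
    GROUP BY region
    ORDER BY revenue DESC;
"""
print(pd.read_sql_query(query, conn))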
Job Description
- Solid technical skills with a proven and successful history working with data at scale and empowering organizations through data
- Big data processing frameworks: Spark, Scala, Hadoop, Hive, Kafka, and EMR with Python (see the sketch after this list)
- Advanced experience and hands-on architecture and administration experience on big data platforms
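
A minimal PySpark sketch of the Spark-with-Python work referenced above; the S3 path and column names are placeholders, and a configured Spark/EMR environment is assumed:

# Hypothetical daily aggregation of an event log with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # placeholder path

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")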
Data Analyst
Job Description
Summary
Are you passionate about handling large & complex data problems, want to make an impact and have the desire to work on ground-breaking big data technologies? Then we are looking for you.
At Amagi, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Amagi’s Data Engineering and Business Intelligence team is looking for passionate, detail-oriented, technically savvy, energetic team members who like to think outside the box.
Amagi’s Data warehouse team deals with petabytes of data catering to a wide variety of real-time, near real-time and batch analytical solutions. These solutions are an integral part of business functions such as Sales/Revenue, Operations, Finance, Marketing and Engineering, enabling critical business decisions. Designing, developing, scaling and running these big data technologies using native technologies of AWS and GCP are a core part of our daily job.
Key Qualifications
- Experience in building highly cost optimised data analytics solutions
- Experience in designing and building dimensional data models to improve accessibility, efficiency and quality of data
- Experience (hands on) in building high quality ETL applications, data pipelines and analytics solutions ensuring data privacy and regulatory compliance.
- Experience in working with AWS or GCP
- Experience with relational and NoSQL databases
- Experience in full-stack web development (preferably Python)
- Expertise with data visualisation systems such as Tableau and Amazon QuickSight
- Proficiency in writing advanced SQL queries with expertise in performance tuning handling large data volumes
- Familiarity with ML/AI technologies is a plus
- Demonstrate strong understanding of development processes and agile methodologies
- Strong analytical and communication skills; self-driven, highly motivated, and able to learn quickly
Description
Data Analytics is at the core of our work, and you will have the opportunity to:
- Design Data-warehousing solutions on Amazon S3 with Athena, Redshift, GCP Bigtable etc
- Lead quick prototypes by integrating data from multiple sources
- Do advanced Business Analytics through ad-hoc SQL queries (see the sketch after this list)
- Work on Sales and Finance reporting solutions using Tableau, HTML5, and React applications
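
For the ad-hoc Athena querying mentioned above, a hedged boto3 sketch might look like this; the database, table, and bucket names are placeholders, and AWS credentials are assumed to be configured:

# Hypothetical ad-hoc query submitted to Amazon Athena via boto3.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="""
        SELECT channel, SUM(revenue) AS total_revenue
        FROM sales_events
        WHERE event_date >= DATE '2024-01-01'
        GROUP BY channel
        ORDER BY total_revenue DESC
    """,
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)

print("Started Athena query:", response["QueryExecutionId"])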
We build amazing experiences and create depth in knowledge for our internal teams and our leadership. Our team is a friendly bunch of people that help each other grow and have a passion for technology, R&D, modern tools and data science.
Our work relies on a deep understanding of the company's needs and an ability to work through vast amounts of internal data such as sales, KPIs, forecasts, and inventory. Key expectations of this role include data analytics, building data lakes, and delivering end-to-end reporting solutions. If you have a passion for cost-optimised analytics and data engineering and are eager to learn advanced data analytics at a large scale, this might just be the job for you.
Education & Experience
A bachelor’s or master’s degree in Computer Science with 5 to 7 years of experience; previous experience in data engineering is a plus.

JOB SUMMARY: The Senior Associate supports the Data Analytics Manager by proposing relevant analytics procedures/tools, executing the analytics and also developing visualization outputs for audits, continuous monitoring/auditing and IA initiatives. The individual’s responsibilities include -
Understanding audit and/or project objectives and assisting the manager in preparing the plan and timelines.
Working with the Process/BU/IA teams for gathering requirements for continuous monitoring/auditing projects.
Working with Internal audit project teams to understand the analytics requirements for audit engagements.
Independently building pilots/prototypes, determining the appropriate visualization tool and designing the views to meet project objectives (a minimal example follows this list).
Proficient in data management and data mining.
Highly skilled in visualization tools such as QlikView, Qlik Sense, Power BI, Tableau, and Alteryx.
Working with Data Analytics Manager to develop analytics program aligned to the overall audit plan.
Showcasing analytics capability to Process management teams to increase adoption of continuous monitoring.
Establishing and maintaining relationships with all key stakeholders of internal audit.
Coaching other data analysts on analytics procedures, coding and tools.
Taking a significant and active role in developing and driving Internal Audit Data Analytics quality and knowledge sharing to enhance the value provided to Internal Audit stakeholders.
Ensuring timely and accurate time tracking.
Continuously focusing on self-development by attending trainings, seminars and acquiring relevant certifications.
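
As one minimal example of a continuous-monitoring prototype of the kind described above, a duplicate-payment check in pandas; the extract file and column names are hypothetical:

# Hypothetical continuous-monitoring check: flag potential duplicate payments.
import pandas as pd

payments = pd.read_csv("payments_extract.csv", parse_dates=["payment_date"])

dupes = payments[
    payments.duplicated(subset=["vendor_id", "invoice_number", "amount"], keep=False)
].sort_values(["vendor_id", "invoice_number"])

print(f"{len(dupes)} payment rows flagged for review")
dupes.to_csv("duplicate_payment_exceptions.csv", index=False)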



Work Location : Chennai
Experience Level : 5+yrs
Package : Up to 18 LPA
Notice Period : Immediate Joiners
It's a full-time opportunity with our client.
Mandatory Skills: Machine Learning, Python, Tableau & SQL
Job Requirements:
--2+ years of industry experience in predictive modeling, data science, and analysis.
--Experience with ML models including but not limited to Regression, Random Forests, and XGBoost (see the sketch after this list).
--Experience in an ML engineer or data scientist role building and deploying ML models, or hands-on experience developing deep learning models.
--Experience writing code in Python and SQL with documentation for reproducibility.
--Strong Proficiency in Tableau.
--Experience handling big datasets, diving into data to discover hidden patterns, using data visualization tools, writing SQL.
--Experience writing and speaking about technical concepts to business, technical, and lay audiences and giving data-driven presentations.
--AWS SageMaker experience is a plus, but not required.
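
As a minimal sketch of the modeling work listed above, training and evaluating a Random Forest with scikit-learn; synthetic data stands in for a real business dataset:

# Hypothetical end-to-end example: fit and score a Random Forest classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data in place of a real business dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
print(f"Test ROC AUC: {roc_auc_score(y_test, scores):.3f}")

In practice the same pipeline is documented for reproducibility and the resulting metrics are surfaced in Tableau dashboards or deployed behind a serving endpoint.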





Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the DataLake Engine along with the Cloud Infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
Requirements
- B.S./M.S./equivalent degree in Computer Science or a related technical field, or equivalent experience
- Fluency in Java/C++ with 8+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team



