Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia

Job Title: Senior Tableau Developer
Location: Gurgaon
Experience: 4–10 Years
Salary: Negotiable
Job Summary:
We need a Senior Tableau Developer with a minimum of 4 years of experience to join our BI team. The ideal candidate will be responsible for designing, developing, and deploying business intelligence solutions using Tableau.
Key Responsibilities:
· Design and develop interactive and insightful Tableau dashboards and visualizations.
· Optimize dashboards for performance and usability.
· Work with SQL and data warehouses (Snowflake) to fetch and prepare clean data sets.
· Gather and analyse business requirements and translate them into functional and technical specifications.
· Collaborate with cross-functional teams to understand business KPIs and reporting needs.
· Conduct unit testing and resolve data or performance issues.
· Apply a strong understanding of data visualization principles and best practices.
Tech. Skills Required:
· Proficient in Tableau Desktop (dashboard development, storyboards)
· Strong command of SQL (joins, subqueries, CTEs, aggregation)
· Experience with large data sets and complex queries
· Experience working on any Data warehouse (Snowflake, Redshift)
· Excellent analytical and problem-solving skills.
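As a minimal sketch of the SQL skills listed above (joins, CTEs, aggregation), the query below runs against an in-memory SQLite database; the table name and sample rows are illustrative, not from any real warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES (1,'North',100.0),(2,'North',250.0),(3,'South',75.0);
""")

# A CTE aggregates revenue per region; the outer query filters the result.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS revenue, COUNT(*) AS n_orders
    FROM orders
    GROUP BY region
)
SELECT region, revenue FROM region_totals WHERE revenue > 100 ORDER BY region;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('North', 350.0)]
```

The same CTE-then-filter pattern carries over directly to Snowflake or Redshift SQL when preparing clean data sets for Tableau.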
Mail an updated resume with current salary to:
Email: etalenthire[at]gmail[dot]com
Satish: 88O 27 49 743



We are seeking a detail-oriented and analytical Data Analyst to collect, process, and analyze data to help drive informed business decisions. The ideal candidate will have strong technical skills, business acumen, and the ability to communicate insights effectively.




Senior Data Scientist
- 6+ years' experience building data pipelines and deployment pipelines for machine learning models
- 4+ years' experience with ML/AI toolkits such as TensorFlow, Keras, AWS SageMaker, MXNet, H2O, etc.
- 4+ years’ experience developing ML/AI models in Python/R
- Must have leadership skills to lead and deliver projects: be proactive, take ownership, interface with the business, represent the team, and share knowledge.
- Strong knowledge of statistical data analysis and machine learning techniques (e.g., Bayesian, regression, classification, clustering, time series, deep learning).
- Should be able to help deploy various models and tune them for better performance.
- Working knowledge of operationalizing models in production using model repositories, APIs, and data pipelines.
- Experience with machine learning and computational statistics packages.
- Experience with Databricks, Data Lake.
- Experience with Dremio, Tableau, Power BI.
- Experience working with Spark ML and Spark DL with PySpark would be a big plus!
- Working knowledge of relational database systems like SQL Server, Oracle.
- Knowledge of deploying models in platforms like PCF, AWS, Kubernetes.
- Good knowledge of continuous integration suites like Jenkins.
- Good knowledge of web servers (Apache, NGINX).
- Good knowledge of Git, GitHub, Bitbucket.
- Java, R, and Python programming experience.
- Should be very familiar with MS SQL, Teradata, Oracle, and DB2.
- Big Data – Hadoop.
- Expert knowledge of BI tools, e.g. Tableau.
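As a minimal sketch of one technique named above (regression), the function below fits a line by ordinary least squares from first principles; no external ML toolkit is assumed, and the sample data is purely illustrative.

```python
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error (ordinary least squares)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]          # exactly y = 2x + 1
print(fit_line(xs, ys))    # (2.0, 1.0)
```

Toolkits such as TensorFlow or SageMaker generalize this same fit-by-minimizing-loss idea to far larger models and data.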
Key Responsibilities:
- Collaborate with business stakeholders and data analysts to understand reporting requirements and translate them into effective Power BI solutions.
- Design and develop interactive and visually compelling dashboards, reports, and visualizations using Microsoft Power BI.
- Ensure data accuracy and consistency in the reports by working closely with data engineers and data architects.
- Optimize and streamline existing Power BI reports and dashboards for better performance and user experience.
- Develop and maintain data models and data connections to various data sources, ensuring seamless data integration.
- Implement security measures and data access controls to protect sensitive information in Power BI reports.
- Troubleshoot and resolve issues related to Power BI reports, data refresh, and connectivity problems.
- Stay updated with the latest Power BI features and capabilities, and evaluate their potential use in improving existing solutions.
- Conduct training sessions and workshops for end-users to promote self-service BI capabilities and enable them to create their own reports.
- Collaborate with the wider data and analytics team to identify opportunities for using Power BI to enhance business processes and decision-making.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or similar role, with a strong portfolio showcasing previous Power BI projects.
- Proficient in Microsoft Power BI, DAX (Data Analysis Expressions), and M (Power Query) to manipulate and analyze data effectively.
- Solid understanding of data visualization best practices and design principles to create engaging and intuitive dashboards.
- Strong SQL skills and experience with data modeling and database design concepts.
- Knowledge of data warehousing concepts and ETL (Extract, Transform, Load) processes.
- Ability to work with various data sources, including relational databases, APIs, and cloud-based platforms.
- Excellent problem-solving skills and a proactive approach to identifying and addressing issues in Power BI reports.
- Familiarity with data security and governance practices in the context of Power BI development.
- Strong communication and interpersonal skills to collaborate effectively with cross-functional teams and business stakeholders.
- Experience with other BI tools (e.g., Tableau, QlikView) is a plus.
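As a minimal sketch of the ETL (Extract, Transform, Load) concept mentioned in the requirements, the snippet below pulls rows from a source, cleans and casts them, and loads them into a SQLite "warehouse" table; the schema and field names are illustrative assumptions.

```python
import sqlite3

# Extract: rows as they might arrive from a raw source (untrimmed strings).
source_rows = [
    {"name": " Alice ", "sales": "120"},
    {"name": "Bob",     "sales": "80"},
]

def transform(row):
    # Transform: clean whitespace and cast types before loading.
    return (row["name"].strip(), int(row["sales"]))

# Load: insert the cleaned rows into a warehouse-style fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (name TEXT, sales INTEGER)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [transform(r) for r in source_rows])
total = conn.execute("SELECT SUM(sales) FROM fact_sales").fetchone()[0]
print(total)  # 200
```

A report layer such as Power BI would then connect to the loaded table rather than to the raw source.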
The role of a Power BI Developer is critical in enabling data-driven decision-making and empowering business users to gain valuable insights from data. The successful candidate will have a passion for data visualization and analytics, along with the ability to adapt to new technologies and drive continuous improvement in BI solutions. If you are enthusiastic about leveraging the power of data through Power BI, we encourage you to apply and join our dynamic team.
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Head complex ETL requirements and design.
- Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements.
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify solution design and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
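As a minimal sketch of the change-data-capture idea mentioned above, the function below diffs two keyed snapshots and emits insert/update/delete events. Real CDC tools such as Informatica CDC read database transaction logs rather than comparing snapshots; this only illustrates the core concept.

```python
def diff_snapshots(old, new):
    """Compare two {key: row} snapshots and emit change events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events

old = {1: "Alice", 2: "Bob"}
new = {1: "Alice", 2: "Bobby", 3: "Carol"}
print(diff_snapshots(old, new))
# [('update', 2, 'Bobby'), ('insert', 3, 'Carol')]
```

Downstream, an ETL pipeline would apply only these events to the target instead of reloading every row.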
- KSQL
- Data Engineering spectrum (Java/Spark)
- Spark Scala / Kafka Streaming
- Confluent Kafka components
- Basic understanding of Hadoop
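As a minimal sketch of the streaming idea behind Kafka Streams and Spark Streaming, the function below groups timestamped events into tumbling windows and counts per key. No broker or cluster is assumed; a plain event list stands in for a topic.

```python
from collections import Counter

def tumbling_window_counts(events, window_ms):
    """events: (timestamp_ms, key) pairs -> counts per (window_start, key)."""
    counts = Counter()
    for ts, key in events:
        # Floor the timestamp to the start of its non-overlapping window.
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return counts

events = [(10, "click"), (40, "click"), (120, "click"), (130, "view")]
counts = tumbling_window_counts(events, 100)
print(counts[(0, "click")], counts[(100, "view")])  # 2 1
```

KSQL expresses the same aggregation declaratively (`WINDOW TUMBLING (SIZE ...)` with `GROUP BY`), with Kafka handling the event transport.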

Experience : 3 to 7 Years
Number of Positions : 20
Job Location : Hyderabad
Notice : 30 Days
1. Expertise in building AWS Data Engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience in developing lambda functions with AWS Lambda
3. Expertise with Spark/PySpark – candidates should be hands-on with PySpark code and able to perform transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus
Hadoop and Hive are good to have; an understanding of them is enough.
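As a minimal sketch of item 2 above, the snippet below shows an AWS Lambda handler in Python. The event shape is an illustrative assumption; locally the handler is just a function, so it can be exercised without deploying to AWS.

```python
import json

def lambda_handler(event, context):
    # Sum the "amount" field of incoming records and return a JSON response,
    # following Lambda's (event, context) handler signature.
    total = sum(rec["amount"] for rec in event.get("records", []))
    return {"statusCode": 200, "body": json.dumps({"total": total})}

# Invoke locally with a hand-built event; in AWS, a trigger supplies it.
result = lambda_handler({"records": [{"amount": 3}, {"amount": 4}]}, None)
print(result)  # {'statusCode': 200, 'body': '{"total": 7}'}
```

In a Glue -> Athena -> QuickSight pipeline, a function like this would typically react to S3 or API Gateway events rather than a hand-built payload.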



- Handling the survey scripting process through the use of survey software platforms such as Toluna, QuestionPro, and Decipher.
- Mining large & complex data sets using SQL, Hadoop, NoSQL or Spark.
- Delivering complex consumer data analysis through the use of software such as R, Python, and Excel.
- Working on basic statistical analysis such as t-tests and correlation.
- Performing more complex data analysis through machine learning techniques such as:
- Classification
- Regression
- Clustering
- Text Analysis
- Neural Networks
- Creating interactive dashboards using software like Tableau or any other software you are able to use.
- Working on statistical and mathematical modelling and the application of ML and AI algorithms.
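As a minimal sketch of one of the basic statistics listed above (Pearson correlation), the function below computes the coefficient directly from its definition in pure Python; the sample data is illustrative.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation: covariance divided by the product of std. deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```

R's `cor()` and Python's `scipy.stats.pearsonr` compute the same quantity, the latter also returning a p-value for significance testing.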
What you need to have:
- Bachelor's or Master's degree in a highly quantitative field (CS, machine learning, mathematics, statistics, economics) or equivalent experience.
- An opportunity for candidates eager to prove their data analytics skills with one of the biggest FMCG market players.
Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience
SQL / PL/SQL: understanding of SQL procedures; able to convert procedures into Informatica mappings.
Good to have: knowledge of Windows Batch Script.





