50+ PowerBI Jobs in India



Job Requirements:
- 3-5 years of experience in Data Science.
- Strong expertise in statistical modeling, machine learning, deep learning, data warehousing, ETL, and reporting tools.
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Business Intelligence, or a related field.
- Experience with relevant programming languages and tools such as Python, R, SQL, Spark, Tableau, Power BI.
- Experience with machine learning frameworks like TensorFlow, PyTorch, or Scikit-learn.
- Ability to think strategically and translate data insights into actionable business recommendations.
- Excellent problem-solving and analytical skills.
- Adaptability and openness to a changing environment and nature of work.
- This is a startup environment with evolving systems and procedures; the ideal candidate will be comfortable working in a fast-paced, dynamic setting and will have a strong desire to make a significant impact on the business.
Job Roles & Responsibilities:
- Conduct in-depth analysis of large-scale datasets to uncover insights and trends.
- Build and deploy predictive and prescriptive machine learning models for various applications.
- Design and execute A/B tests to evaluate the effectiveness of different strategies (a minimal sketch follows this list).
- Collaborate with product managers, engineers, and other stakeholders to drive data-driven decision-making.
- Stay up-to-date with the latest advancements in data science and machine learning.
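For illustration, here is a minimal Python sketch of the kind of A/B test evaluation described above; the conversion counts, sample sizes, and 5% threshold are hypothetical examples, not project specifics.

```python
# Minimal A/B test sketch: compare conversion rates of two variants
# with a two-proportion z-test. All numbers here are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]   # successes in variant A and variant B
samples = [10000, 10000]   # users exposed to each variant

stat, p_value = proportions_ztest(count=conversions, nobs=samples)
print(f"z = {stat:.3f}, p = {p_value:.4f}")

# Reject the null hypothesis (no difference between variants)
# at the 5% significance level.
if p_value < 0.05:
    print("Statistically significant difference between variants.")
else:
    print("No significant difference detected.")
```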
Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.
The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Design, build, and maintain scalable data pipelines for structured and unstructured data sources
- Develop ETL processes to collect, clean, and transform data from internal and external systems (a minimal sketch follows this list)
- Support integration of data into dashboards, analytics tools, and reporting systems
- Collaborate with data analysts and software developers to improve data accessibility and performance
- Document workflows and maintain data infrastructure best practices
- Assist in identifying opportunities to automate repetitive data tasks
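For a concrete flavor of the ETL responsibilities above, here is a deliberately small-scale pandas sketch of the extract-transform-load pattern; the file paths, table, and column names are hypothetical.

```python
# Minimal ETL sketch: extract from a CSV source, clean and transform,
# load into a SQLite database. Paths, table, and column names are hypothetical.
import pandas as pd
import sqlite3

# Extract: read raw records from an external source
raw = pd.read_csv("raw_listings.csv")

# Transform: drop duplicates, normalize types, keep only valid rows
clean = (
    raw.drop_duplicates()
       .assign(closing_date=lambda df: pd.to_datetime(df["closing_date"], errors="coerce"))
       .dropna(subset=["closing_date", "price"])
)

# Load: write the cleaned table into a reporting database
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("listings", conn, if_exists="replace", index=False)
```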
Key Responsibilities:
Design, develop, and maintain interactive dashboards and reports using Power BI and Looker.
Write and optimize DAX queries to support complex business logic and calculations (a brief sketch follows this posting's qualifications).
Collaborate with data engineers, analysts, and business stakeholders to understand reporting needs and translate them into technical solutions.
Ensure data accuracy, consistency, and performance across BI solutions.
Perform data analysis and validation to support business initiatives.
Automate and streamline reporting processes for efficiency and scalability.
Stay updated with the latest BI tools, trends, and best practices.
Required Skills & Qualifications:
Minimum 5 years of experience in BI development and data analytics.
Strong proficiency in Power BI, including DAX and Power Query.
Hands-on experience with Looker and LookML.
Solid understanding of data modeling, ETL processes, and SQL.
Ability to work with large datasets and optimize performance.
Excellent problem-solving and communication skills.
Bachelor's degree in Computer Science, Information Systems, or a related field.
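To illustrate the DAX skills this posting calls for, the hedged sketch below runs a DAX query against a published dataset through the Power BI REST executeQueries endpoint; the dataset ID, access token, and model names (Sales, 'Date') are hypothetical placeholders, and a real call requires appropriate Azure AD permissions.

```python
# Hedged sketch: run a DAX query against a Power BI dataset via the
# REST executeQueries endpoint. Dataset ID, token, and model names
# (Sales, 'Date') are hypothetical placeholders.
import requests

DATASET_ID = "<your-dataset-id>"          # hypothetical
ACCESS_TOKEN = "<azure-ad-access-token>"  # obtain via MSAL in practice

dax_query = """
EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Year],
    "Total Sales", SUM(Sales[Amount])
)
"""

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"queries": [{"query": dax_query}],
          "serializerSettings": {"includeNulls": True}},
)
resp.raise_for_status()
for row in resp.json()["results"][0]["tables"][0]["rows"]:
    print(row)
```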

Data Automation Intern (Remote)
GreenTree is an asset management firm specializing in cross-border real estate investment banking in China and the USA. This is an online/remote internship. The internship is unpaid. It is up to the applicant's discretion to choose a start and end date (the dates listed on the posting are flexible). You will be given a short quiz to test your background and passion for data automation.
Responsibilities:
- Work with large datasets to clean, transform, and organize data for analysis and reporting.
- Develop and implement automated data pipelines using Excel, Power Query, Power BI, or other data tools.
- Research and recommend tools to improve data workflow efficiency.
- Collaborate with other teams to gather data requirements and support automation goals.
- Test and troubleshoot automated solutions and workflows to ensure data accuracy.
Qualifications:
- Strong understanding of Excel and data processing tools (Power Query, Power BI, or similar).
- Passion for automating data workflows and improving efficiency.
- Excellent attention to detail and problem-solving skills.
- Proficiency in Microsoft Office Suite (especially Excel); familiarity with Google Sheets is a plus.
- Self-driven and organized with strong communication skills.
- Willingness to learn new tools and apply feedback quickly.
- Ability to prioritize and manage multiple tasks independently.
While this position is unpaid, it can be used to receive course credit at partner universities.
About GreenTree:
GreenTree provides project identification, acquisition negotiation, developer liquidity, strata asset pricing, property management, and leasing solutions. GreenTree identifies and underwrites project risk, pricing, and strategies while simultaneously negotiating with developers and 3rd-party sales rep companies for projects in the USA and China.
Greentree’s website: www.greentree.group



Remote Job Opportunity
Job Title: Data Scientist
Contract Duration: 6 months+
Location: Offshore (India)
Work Time: 3 pm to 12 am
Must have 4+ Years of relevant experience.
Job Summary:
We are seeking an AI Data Scientist with a strong foundation in machine learning, deep learning, and statistical modeling to design, develop, and deploy cutting-edge AI solutions.
The ideal candidate will have expertise in building and optimizing AI models, with a deep understanding of both statistical theory and modern AI techniques. You will work on high-impact projects, from prototyping to production, collaborating with engineers, researchers, and business stakeholders to solve complex problems using AI.
Key Responsibilities:
Research, design, and implement machine learning and deep learning models for predictive and generative AI applications.
Apply advanced statistical methods to improve model robustness and interpretability.
Optimize model performance through hyperparameter tuning, feature engineering, and ensemble techniques (a minimal tuning sketch follows this posting).
Perform large-scale data analysis to identify patterns, biases, and opportunities for AI-driven automation.
Work closely with ML engineers to validate, train, and deploy the models.
Stay updated with the latest research and developments in AI and machine learning to ensure innovative and cutting-edge solutions.
Qualifications & Skills:
Education: PhD or Master's degree in Statistics, Mathematics, Computer Science, or a related field.
Experience:
4+ years of experience in machine learning and deep learning, with expertise in algorithm development and optimization.
Proficiency in SQL, Python, and visualization tools (Power BI).
Experience in developing mathematical models for business applications, preferably in finance, trading, image-based AI, biomedical modeling, or recommender systems.
Strong communication skills to interact effectively with both technical and non-technical stakeholders.
Excellent problem-solving skills with the ability to work independently and as part of a team.
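As a small illustration of the hyperparameter tuning and ensemble techniques mentioned in the responsibilities above, here is a scikit-learn sketch on a synthetic dataset; the parameter grid is an arbitrary example.

```python
# Minimal hyperparameter-tuning sketch with scikit-learn.
# The dataset is synthetic and the parameter grid is an arbitrary example.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 10, 20],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid,
    cv=5,                # 5-fold cross-validation
    scoring="f1",
    n_jobs=-1,
)
search.fit(X, y)
print("Best params:", search.best_params_)
print("Best CV F1:", round(search.best_score_, 3))
```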

About the Role:
We are looking for a highly skilled Data Engineer with a strong foundation in Power BI, SQL, Python, and Big Data ecosystems to help design, build, and optimize end-to-end data solutions. The ideal candidate is passionate about solving complex data problems, transforming raw data into actionable insights, and contributing to data-driven decision-making across the organization.
Key Responsibilities:
- Data Modelling & Visualization
- Build scalable and high-quality data models in Power BI using best practices.
- Define relationships, hierarchies, and measures to support effective storytelling.
- Ensure dashboards meet standards in accuracy, visualization principles, and timelines.
- Data Transformation & ETL
- Perform advanced data transformation using Power Query (M Language) beyond UI-based steps.
- Design and optimize ETL pipelines using SQL, Python, and Big Data tools.
- Manage and process large-scale datasets from various sources and formats.
- Business Problem Translation
- Collaborate with cross-functional teams to translate complex business problems into scalable, data-centric solutions.
- Decompose business questions into testable hypotheses and identify relevant datasets for validation.
- Performance & Troubleshooting
- Continuously optimize performance of dashboards and pipelines for latency, reliability, and scalability.
- Troubleshoot and resolve issues related to data access, quality, security, and latency, adhering to SLAs.
- Analytical Storytelling
- Apply analytical thinking to design insightful dashboards—prioritizing clarity and usability over aesthetics.
- Develop data narratives that drive business impact.
- Solution Design
- Deliver wireframes, POCs, and final solutions aligned with business requirements and technical feasibility.
Required Skills & Experience:
- Minimum 3+ years of experience as a Data Engineer or in a similar data-focused role.
- Strong expertise in Power BI: data modeling, DAX, Power Query (M Language), and visualization best practices.
- Hands-on with Python and SQL for data analysis, automation, and backend data transformation.
- Deep understanding of data storytelling, visual best practices, and dashboard performance tuning.
- Familiarity with DAX Studio and Tabular Editor.
- Experience in handling high-volume data in production environments.
Preferred (Good to Have):
- Exposure to Big Data technologies (a minimal PySpark sketch follows this list), such as:
  - PySpark
  - Hadoop
  - Hive / HDFS
  - Spark Streaming (optional but preferred)
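For a flavor of the PySpark exposure listed above, here is a minimal batch-transformation sketch; the S3 paths, schema, and aggregation are hypothetical.

```python
# Minimal PySpark sketch: read raw events, aggregate, and write out
# in a partitioned columnar format. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/sales/")  # hypothetical path

daily = (
    events
    .withColumn("sale_date", F.to_date("event_ts"))
    .groupBy("sale_date", "region")
    .agg(F.sum("amount").alias("total_amount"),
         F.countDistinct("order_id").alias("orders"))
)

daily.write.mode("overwrite").partitionBy("sale_date") \
     .parquet("s3://example-bucket/marts/daily_sales/")
```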
Why Join Us?
- Work with a team that's passionate about data innovation.
- Exposure to modern data stack and tools.
- Flat structure and collaborative culture.
- Opportunity to influence data strategy and architecture decisions.
Job Title : Technology Strategy and Reporting Specialist
Total Experience : 8+ Years
Relevant Experience : Minimum 3 years of experience in Strategic technology planning and reporting.
Location : Remote, Anywhere in India
Job Type : Full Time
Job Overview :
We are seeking a Technology Strategy and Reporting Specialist to lead the development of strategic dashboards, quarterly board reports, and executive-level communications.
The ideal candidate will bridge the gap between technical insights and business strategy, delivering high-impact reports that drive executive decision-making and organizational performance.
Mandatory Skills :
Power BI or Tableau, Advanced Excel, Executive Reporting, Data Analysis, Strategic Dashboarding, IT KPIs, Storytelling, and Communication Skills
Key Responsibilities :
- Quarterly Board Reporting : Deliver structured, governance-aligned technology reports with KPIs, risk assessments, and strategic recommendations for executive leadership and the board.
- Dashboard Development : Build and manage executive dashboards using tools like Power BI/Tableau to showcase real-time performance and strategic metrics.
- Executive Storytelling : Create polished narratives and presentations that translate technical data into compelling business insights.
- Data Analysis & Insights : Analyze performance data, market trends, and technology strategies to identify actionable insights and business opportunities.
- Reporting Framework : Standardize reporting templates, processes, and governance models to ensure consistent, high-quality outputs.
- Technology Performance Monitoring : Track initiative outcomes, ROI, and budget adherence with scorecards and strategic alignment reports.
Desired Experience :
- 3+ Years in business intelligence, strategic reporting, or executive communication.
- Hands-on experience with dashboard tools (Power BI, Tableau, Excel).
- Strong understanding of IT performance metrics and KPIs.
- Proven record of creating board-level/C-suite presentations.
Qualifications :
- Bachelor's in Business Analytics, Computer Science, Information Systems, Engineering, or related field (Master’s preferred).
- Certifications such as PMP, Power BI, Tableau, or Six Sigma are a plus.
Job Summary:
Position : Senior Power BI Developer
Experience : 4+ Years
Location : Ahmedabad - WFO
Key Responsibilities:
- Design, develop, and maintain interactive and user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay updated with Power BI updates, best practices, and industry trends.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M Language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience in working with large and complex datasets.
- Experience with BigQuery, MySQL, or Looker Studio is a plus.
- Ecommerce Industry Experience will be an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Apps and Power Automate would be a plus.
Preferred Qualifications:
- Microsoft Power BI Certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.



We are seeking a detail-oriented and analytical Data Analyst to collect, process, and analyze data to help drive informed business decisions. The ideal candidate will have strong technical skills, business acumen, and the ability to communicate insights effectively.

We are looking for an experienced and dynamic trainer to join our team. The ideal candidate will be responsible for designing, developing, and delivering high-quality technical training programs to students.


We are looking for a dynamic and skilled Business Analyst Trainer with 2 to 5 years of hands-on industry and/or teaching experience. The ideal candidate should be able to simplify complex data concepts, mentor aspiring professionals, and deliver effective training programs in Business Analysis, Power BI, Tableau, and Machine Learning.
We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.
Job Description :
We are seeking a highly experienced Sr. Data Modeler / Solution Architect to join the Data Architecture team at our Corporate Office in Bangalore. The ideal candidate will have 4 to 8 years of experience in data modeling and architecture with deep expertise in the AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units.
This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.
Key Responsibilities:
· Design and deliver conceptual, logical, and physical data models using tools like ERWin.
· Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
· Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
· Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
· Develop and optimize SQL code, materialized views, and stored procedures in AWS Redshift (a brief sketch follows this list).
· Ensure data governance, lineage, and quality mechanisms are established across systems.
· Lead and mentor technical teams in an Agile project delivery model.
· Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
· Identify data gaps and availability issues with respect to source systems.
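As a brief illustration of the Redshift work above, here is a hedged Python sketch that creates and refreshes a materialized view via psycopg2; the cluster endpoint, schema, and table names are hypothetical.

```python
# Hedged sketch: create and refresh a Redshift materialized view from
# Python via psycopg2. Connection details and names are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="***",
)
conn.autocommit = True

CREATE_MV = """
CREATE MATERIALIZED VIEW mv_daily_pos_sales AS
SELECT sale_date, store_id, SUM(amount) AS total_amount
FROM pos.transactions
GROUP BY sale_date, store_id;
"""

with conn.cursor() as cur:
    cur.execute(CREATE_MV)
    # Refresh picks up new rows from the base table; Redshift refreshes
    # incrementally where the view's SQL allows it.
    cur.execute("REFRESH MATERIALIZED VIEW mv_daily_pos_sales;")
conn.close()
```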
Required Skills & Qualifications:
· Bachelor’s or Master’s degree in Computer Science, IT, or related field (B.E./B.Tech/M.E./M.Tech/MCA).
· Minimum 4 years of experience in data modeling and architecture.
· Proficiency with data modeling tools such as ERWin, with strong knowledge of forward and reverse engineering.
· Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
· Strong experience in data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
· Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
· Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
· Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
· Familiarity with data governance tools (e.g., ORION/EIIG).
· Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
· Excellent written and verbal communication skills.
Certifications (Preferred):
· AWS Certification (e.g., AWS Certified Solutions Architect or Data Analytics – Specialty)
· Data Governance or Data Modeling Certifications (e.g., CDMP, Databricks, or TOGAF)
Mandatory Skills
AWS, Technical Architecture, AI/ML, SQL, Data Warehousing, Data Modelling
Job Description:
We are seeking a skilled Power BI Developer with a strong understanding of Capital Markets to join our data analytics team. The ideal candidate will be responsible for designing, developing, and maintaining interactive dashboards and reports that provide insights into trading, risk, and financial performance. This role requires experience working with capital market data sets and a solid grasp of financial instruments and market operations.
Key Responsibilities:
- Develop interactive Power BI dashboards and reports tailored to capital markets (e.g., equities, derivatives, fixed income).
- Connect to and integrate data from various sources such as Bloomberg, Reuters, SQL databases, and Excel.
- Translate business requirements into data models and visualizations that provide actionable insights.
- Optimize Power BI reports for performance, usability, and scalability.
- Work closely with business stakeholders (trading, risk, compliance) to understand KPIs and analytics needs.
- Implement row-level security and data access controls.
- Maintain data quality, lineage, and versioning documentation.
Required Skills & Qualifications:
- 3+ years of experience with Power BI (Power Query, DAX, data modeling).
- Strong understanding of capital markets: trading workflows, market data, instruments (equities, bonds, derivatives, etc.).
- Experience with SQL and working with large financial datasets.
- Familiarity with risk metrics, trade lifecycle, and financial statement analysis.
- Knowledge of data governance, security, and performance tuning in BI environments.
- Excellent communication skills and ability to work with cross-functional teams.
Preferred Qualifications:
- Experience with Python or R for data analysis.
- Knowledge of investment banking or asset management reporting frameworks.
- Exposure to cloud platforms like Azure, AWS, or GCP.
- Certifications in Power BI or Capital Markets.

Only candidates currently in Bihar or open to relocating to Bihar should apply:
Job Description:
This is an exciting opportunity for an experienced industry professional with strong analytical and technical skills to join and add value to a dedicated and friendly team. We are looking for a Data Analyst who is driven by data-driven decision-making and insights. As a core member of the Analytics Team, the candidate will take ownership of data analysis projects by working independently with little supervision.
The ideal candidate is a highly resourceful and innovative professional with extensive experience in data analysis, statistical modeling, and data visualization. The candidate must have a strong command of data analysis tools like SAS/SPSS, Power BI/Tableau, or R, along with expertise in MS Excel and MS PowerPoint. The role requires optimizing data collection procedures, generating reports, and applying statistical techniques for hypothesis testing and data interpretation.
Key Responsibilities:
• Perform data analysis using tools like SAS, SPSS, Power BI, Tableau, or R.
• Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis.
• Utilize statistical techniques for hypothesis testing to validate data and interpretations.
• Apply data mining techniques and OLAP methodologies for in-depth insights.
• Develop dashboards and data visualizations to present findings effectively.
• Collaborate with cross-functional teams to define, design, and execute data-driven strategies.
• Ensure the accuracy and integrity of data used for analysis and reporting.
• Utilize advanced Excel skills to manipulate and analyze large datasets.
• Prepare technical documentation and presentations for stakeholders.
Candidate Profile:
Required Qualifications:
• Qualification: Graduate or Postgraduate degree in Statistics, MCA, or BE/B.Tech in Computer Science & Engineering, Information Technology, or Electronics.
• A minimum of 2 years' experience in data analysis using SAS/SPSS, Power BI/Tableau, or R.
• Proficiency in MS Office with expertise in MS Excel & MS PowerPoint.
• Strong analytical skills with attention to detail.
• Experience in data mining and OLAP methodologies.
• Ability to generate insights and reports based on data trends.
• Excellent communication and presentation skills.
Desired Qualifications:
• Experience in predictive analytics and machine learning techniques.
• Knowledge of SQL and database management.
• Familiarity with Python for data analysis.
• Experience in automating reporting processes.
We are seeking a detail-oriented and analytical Business Analyst to bridge the gap between business needs and technology solutions. The ideal candidate will be responsible for analyzing business processes, identifying improvement areas, and supporting data-driven decision-making through insights and documentation.


We are seeking a passionate and knowledgeable Data Science and Data Analyst Trainer to deliver engaging and industry-relevant training programs. The trainer will be responsible for teaching core concepts in data analytics, machine learning, data visualization, and related tools and technologies. The ideal candidate will have hands-on experience in the data domain with 2-5 years and a flair for teaching and mentoring students or working professionals.


We are looking for a dynamic and skilled Data Science and Data Analyst Trainer with 2 to 5 years of hands-on industry and/or teaching experience. The ideal candidate should be able to simplify complex data concepts, mentor aspiring professionals, and deliver effective training programs in data analytics, data science, and business intelligence tools.


We are seeking a dynamic and experienced Data Analytics and Data Science Trainer to deliver high-quality training sessions, mentor learners, and design engaging course content. The ideal candidate will have a strong foundation in statistics, programming, and data visualization tools, and should be passionate about teaching and guiding aspiring professionals.

Job Title: Manager - Retail Analyst and Communication (Apparel Retail)
Experience: 6-8 years in Retail Business Management Information System
Location: Gurgaon
Salary: Negotiable
Industry: Retail/ Apparel/ Fashion
This role is responsible for managing the end-to-end Retail Business Management Information System (MIS). It involves handling internal communication channels, ensuring adherence to reporting processes and implementing weekly actions based on data insights.
Key Deliverables (Essential functions & Responsibilities of the Job):
· Own and manage all aspects of the MIS and data systems, including report and dashboard generation.
· Coordinate with cross-functional teams for data collection, reporting, and timely store projects.
· Lead monthly, quarterly, and annual target setting for sales and key retail performance metrics.
· Monitor weekly, monthly, and seasonal performance, and circulate actionable insights and reports to stores and leadership.
· Deliver weekly retail communications, ensuring clarity on actions required.
· Support ad hoc data requirements with advanced Excel capabilities.
· Work with tools like SAP, OneDrive, Tableau, Power BI, etc.
Key Skills Required:
· Strong multitasking and time management skills.
· Self-driven, organized, and goal-oriented.
· Analytical mindset with the ability to convert data into actionable insights.
· Proficiency in Microsoft Office Suite, particularly Excel; familiarity with data visualization tools preferred.
Mail updated resume with current salary:
email: etalenthire[at]gmail[dot]com
satish: 88 O2 74 97 43
website: www.glansolutions.com

We’re seeking a detail-oriented Data Analyst with proven experience in corporate settings to transform raw data into actionable insights. You’ll collaborate across departments to support strategic decision-making, optimize operations, and enhance business performance through data-driven analysis.

We are looking for a skilled and detail-oriented Data Analyst – Data Scientist (DA-DS) to join our team. This hybrid role involves analyzing large datasets to extract insights, build predictive models, and support data-driven decision-making. You’ll work closely with cross-functional teams to transform raw data into actionable insights using statistical techniques, data visualization, and machine learning tools.
Immediate Hiring for Business Analyst
Position: Business Analyst
Experience: 5-8 Years
Location: Hyderabad
Job Summary:
We are seeking a motivated and detail-oriented Business Analyst with 5 years of experience in the Travel domain. The ideal candidate will have a strong understanding of the travel industry, including airlines, travel agencies, and online booking systems. You will work closely with cross-functional teams to gather business requirements, analyze processes, and deliver solutions that improve customer experience and operational efficiency.
Key Responsibilities:
- Requirement Gathering & Analysis: Collaborate with stakeholders to gather, document, and analyze business requirements, ensuring alignment with business goals.
- Process Improvement: Identify opportunities for process improvement and optimization in travel booking, ticketing, and customer support systems.
- Stakeholder Communication: Act as the bridge between the business stakeholders and technical teams, ensuring clear communication of requirements, timelines, and deliverables.
- Solution Design: Participate in the design and development of solutions, collaborating with IT and development teams to ensure business needs are met.
- Data Analysis: Analyze data related to customer journeys, bookings, and cancellations to identify trends and insights for decision-making.
- Documentation: Prepare detailed documentation including business requirements documents (BRD), user stories, process flows, and functional specifications.
- Testing & Validation: Support testing teams during User Acceptance Testing (UAT) to ensure solutions meet business needs, and facilitate issue resolution.
- Market Research: Stay up to date with travel industry trends, customer preferences, and competitor offerings to ensure innovative solutions are delivered.
Qualifications & Skills:
- Education: Bachelor’s degree in Business Administration, Information Technology, or a related field.
- Experience:
- 5 years of experience as a Business Analyst in the travel industry.
- Hands-on experience in working with travel booking systems (GDS, OTA) is highly preferred.
- Domain Knowledge:
- Strong understanding of the travel industry, including booking engines, reservations, ticketing, cancellations, and customer support.
- Familiarity with industry-specific regulations and best practices.
- Analytical Skills: Excellent problem-solving skills with the ability to analyze complex data and business processes.
- Technical Skills:
- Proficiency in Microsoft Office (Word, Excel, PowerPoint).
- Knowledge of SQL or data visualization tools (Power BI, Tableau) is a plus.
- Communication: Strong verbal and written communication skills with the ability to convey complex information clearly.
- Attention to Detail: Strong focus on accuracy and quality of work, ensuring that solutions meet business requirements.
Preferred:
- Prior experience with Agile methodologies.
- Certification in Business Analysis (CBAP or similar).

We are seeking a highly motivated and knowledgeable DADS Trainer to conduct hands-on training in Data Analytics and Data Science. The ideal candidate will have strong domain expertise, coding proficiency, and a passion for teaching concepts in Python, statistics, machine learning, data visualization, and tools like Excel, Power BI, and SQL.



About the Role:
We are looking for a Senior Technical Customer Success Manager to join our growing team. This is a client-facing role focused on ensuring successful adoption and value realization of our SaaS solutions. The ideal candidate will come from a strong analytics background, possess hands-on skills in SQL and Python or R, and have experience working with dashboarding tools. Prior experience in eCommerce or retail domains is a strong plus.
Responsibilities:
- Own post-sale customer relationship and act as the primary technical point of contact.
- Drive product adoption and usage through effective onboarding, training, and ongoing support.
- Work closely with clients to understand business goals and align them with product capabilities.
- Collaborate with internal product, engineering, and data teams to deliver solutions and enhancements tailored to client needs.
- Analyze customer data and usage trends to proactively identify opportunities and risks.
- Build dashboards or reports for customers using internal tools or integrations.
- Lead business reviews, share insights, and communicate value delivered.
- Support customers in configuring rules, data integrations, and troubleshooting issues.
- Drive renewal and expansion by ensuring customer satisfaction and delivering measurable outcomes.
Requirements:
- 7+ years of experience in a Customer Success, Technical Account Management, or Solution Consulting role in a SaaS or software product company.
- Strong SQL skills and working experience with Python or R.
- Experience with dashboarding tools such as Tableau, Power BI, Looker, or similar.
- Understanding of data pipelines, APIs, and data modeling.
- Excellent communication and stakeholder management skills.
- Proven track record of managing mid to large enterprise clients.
- Experience in eCommerce, retail, or consumer-facing businesses is highly desirable.
- Ability to translate technical details into business context and vice versa.
- Bachelor’s or Master’s degree in Computer Science, Analytics, Engineering, or related field.
Nice to Have:
- Exposure to machine learning workflows, recommendation systems, or pricing analytics.
- Familiarity with cloud platforms (AWS/GCP/Azure).
- Experience working with cross-functional teams in Agile environments.

Senior Data Engineer Job Description
Overview
The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.
Key Responsibilities
- Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
- Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
- Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
- Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
- Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
- Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
- Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
- Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
- Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
Required Qualifications
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
- Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
- Technical Skills:
  - Proficiency in programming languages such as Python, Java, or Scala.
  - Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
  - Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
  - Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks (a minimal Airflow sketch follows these qualifications).
  - Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
  - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
- Soft Skills:
  - Excellent problem-solving and analytical skills.
  - Strong communication and collaboration abilities.
  - Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
- Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.
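As a minimal illustration of the Apache Airflow experience listed under Technical Skills, here is a sketch of a small daily ETL DAG; the task bodies and identifiers are hypothetical.

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# Task bodies and names are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```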
Preferred Qualifications:
- Experience with real-time data processing and streaming architectures.
- Familiarity with machine learning pipelines and MLOps practices.
- Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
- Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.
Work Environment
- Location: Hybrid/Remote/On-site (depending on company policy).
- Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
- Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Description
Job Description
Founded in 2015, GreenTree Group is a cutting-edge real estate investment and asset management firm based in China. At GreenTree, we focus on acquiring high-quality, low-cost assets at significant discounts and transforming them through repositioning, leasing, and reselling. Leveraging our expertise and deep local networks, we consistently identify and unlock hidden value in overlooked opportunities using innovative, technology-driven methods.
Job Highlights
As a Data Science Analyst Intern for Microsoft Power Automate Projects, you will bridge the gap between data analytics and automation. You will play a pivotal role in designing, developing, and refining automated data workflows that enhance our business processes. By collaborating with cross-functional teams, you will help drive operational efficiency and data-driven decision-making across the organization—all while gaining real-world experience in data science and process automation.
Responsibilities
Design & Development: Work on building and optimizing automation workflows using Microsoft Power Automate to streamline critical business processes.
Data Integration: Collect and integrate data from various sources such as databases, APIs, spreadsheets, and web services to support analytic initiatives.
Analysis & Reporting: Utilize tools like Excel, SQL, and Python to analyze data, identify trends, and generate actionable insights that inform strategic decisions.
Collaboration: Partner with data, operations, and engineering teams to identify automation opportunities that improve efficiency and reduce manual effort.
Documentation: Develop process documentation, including detailed workflow diagrams, to ensure clarity and repeatability of automation processes.
Continuous Improvement: Assist in testing, deployment, and iterative refinement of automated solutions to maintain optimal performance and reliability.
What We Offer
Comprehensive Training: Access to extensive training materials and support to enhance your technical and professional skills.
Mentorship: Regular feedback and guidance sessions with supervisors and directors to foster your growth in data analytics and process automation.
Career Development: Opportunities for increased responsibility and project leadership based on performance and results.
Real-World Impact: Direct involvement in high-impact projects that drive operational improvements and technological innovation.
Requirements
Educational Background: Pursuing or holding a degree in Data Science, Computer Science, Information Systems, or a related field.
Technical Skills: Proficiency in data analysis using Excel; familiarity with SQL, Python, or other programming languages is preferred.
Automation Exposure: Exposure to Microsoft Power Automate, Power Apps, or similar RPA tools is an asset.
Analytical Mindset: Strong logical reasoning and problem-solving capabilities, with an eagerness to learn and explore new solutions.
Communication: Excellent verbal and written communication skills, with the ability to clearly present complex information.
Team Player: A proactive and collaborative approach to working within cross-functional teams.
About GreenTree
GreenTree Group is dedicated to delivering innovative solutions in real estate investment and asset management. Our mission is to build lasting relationships with our investors by utilizing advanced technology to optimize processes and deliver exceptional value. We pride ourselves on our commitment to excellence, creativity, and integrity in every project we undertake.
Location: Shanghai, China
Start your career with GreenTree and gain invaluable experience at the intersection of data science, automation, and real estate operations!
Job Type: Full-time
Pay: $50 per month
Work Location: Remote

Job Purpose
Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.
Key Responsibilities:
- Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
- Perform data transformation and validation for accuracy and consistency.
- Upload processed datasets into SQL Server using SSIS packages.
- Monitor and optimize database performance, identifying and resolving bottlenecks.
- Perform regular backups, restorations, and recovery checks to ensure data continuity.
- Manage user access and implement robust database security policies.
- Oversee database storage allocation and utilization.
- Conduct routine maintenance and support incident management, including root cause analysis and resolution.
- Design and implement scalable database solutions and architecture.
- Create and maintain stored procedures, views, and other database components.
- Optimize SQL queries for performance and scalability.
- Execute ETL processes and support seamless integration of multiple data sources.
- Maintain data integrity and quality through validation and cleansing routines.
- Collaborate with cross-functional teams on data solutions and project deliverables.
Educational Qualification: Any Graduate
Required Skills & Qualifications:
- Proven experience with SQL Server or similar relational database platforms.
- Strong expertise in SSIS, ETL processes, and data warehousing.
- Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
- Experience in database security, user role management, and access control.
- Familiarity with backup/recovery strategies and database maintenance best practices.
- Strong analytical skills with experience working with large and complex datasets.
- Solid understanding of data modeling, normalization, and schema design.
- Knowledge of incident and change management processes.
- Excellent communication and collaboration skills.
- Experience with Python for data manipulation and automation is a strong plus.
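Since the posting pairs SSIS-based loads with Python for data manipulation, here is a hedged Python sketch of a load-and-validate step against SQL Server via pyodbc; the server, database, and table names are hypothetical, and in production this step would typically run as an SSIS package per the SOP above.

```python
# Hedged sketch: validate a dataset and load it into SQL Server from
# Python. Server, database, and table names are hypothetical.
import pandas as pd
import pyodbc

df = pd.read_csv("qa_audit_reports.csv")                          # hypothetical input
df = df.dropna(subset=["audit_id"]).drop_duplicates("audit_id")   # basic validation

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=AuditDB;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch inserts for performance

cursor.executemany(
    "INSERT INTO dbo.AuditReports (audit_id, score, reviewed_on) VALUES (?, ?, ?)",
    list(df[["audit_id", "score", "reviewed_on"]].itertuples(index=False, name=None)),
)
conn.commit()
conn.close()
```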

Location: Mumbai
Job Type: Full-Time (Hybrid – 3 days in office, 2 days WFH)
Job Overview:
We are looking for a skilled Azure Data Engineer with strong experience in data modeling, pipeline development, and SQL/Spark expertise. The ideal candidate will work closely with the Data Analytics & BI teams to implement robust data solutions on Azure Synapse and ensure seamless data integration with third-party applications.
Key Responsibilities:
- Design, develop, and maintain Azure data pipelines using Azure Synapse (SQL dedicated pools or Apache Spark pools).
- Implement data models in collaboration with the Data Analytics and BI teams.
- Optimize and manage large-scale SQL and Spark-based data processing solutions.
- Ensure data availability and reliability for third-party application consumption.
- Collaborate with cross-functional teams to translate business requirements into scalable data solutions.
Required Skills & Experience:
3–5 years of hands-on experience in:
- Azure data services
- Data Modeling
- SQL development and tuning
- Apache Spark
- Strong knowledge of Azure Synapse Analytics.
- Experience in designing data pipelines and ETL/ELT processes.
- Ability to troubleshoot and optimize complex data workflows.
Preferred Qualifications:
- Experience with data governance, security, and data quality practices.
- Familiarity with DevOps practices in a data engineering context.
- Effective communication skills and the ability to work in a collaborative team environment.

About the Role:
We are looking for a skilled and detail-oriented Data Analyst to join our team. The ideal candidate will be responsible for collecting, analyzing, and interpreting large datasets to support data-driven decision-making across the organization. Proficiency in MongoDB and SQL is essential for this role.
Key Responsibilities:
- Collect, process, and clean structured and unstructured data from various sources.
- Analyze data using SQL queries and MongoDB aggregations to extract insights (a minimal aggregation sketch follows this list).
- Develop and maintain dashboards, reports, and visualizations to present data in a meaningful way.
- Collaborate with cross-functional teams to identify business needs and provide data-driven solutions.
- Monitor data quality and integrity, ensuring accuracy and consistency.
- Support the development of predictive models and data pipelines.
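For illustration of the MongoDB aggregation work above, here is a minimal pymongo sketch; the connection string, collection, and field names are hypothetical.

```python
# Minimal MongoDB aggregation sketch with pymongo.
# Connection string, collection, and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["analytics"]["orders"]

# Total revenue and order count per region, highest revenue first
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {
        "_id": "$region",
        "revenue": {"$sum": "$amount"},
        "orders": {"$sum": 1},
    }},
    {"$sort": {"revenue": -1}},
]

for row in orders.aggregate(pipeline):
    print(row["_id"], row["revenue"], row["orders"])
```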
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
- Proven experience as a Data Analyst or similar role.
- Strong proficiency in SQL for data querying and manipulation.
- Hands-on experience with MongoDB, including working with collections, documents, and aggregations.
- Knowledge of data visualization tools such as Tableau, Power BI, or similar (optional but preferred).
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
Good to Have:
- Experience with Python/R for data analysis.
- Exposure to ETL tools and data warehousing concepts.
- Understanding of statistical methods and A/B testing.
Job Title: Tableau BI Developer
Years of Experience: 4-8 Years
Engagement: $12 per hour (FTE)
Working Hours: 8 hours per day
Required Skills & Experience:
✅ 4–8 years of experience in BI development and data engineering
✅ Expertise in BigQuery and/or Snowflake for large-scale data processing
✅ Strong SQL skills with experience writing complex analytical queries
✅ Experience in creating dashboards in tools like Power BI, Looker, or similar
✅ Hands-on experience with ETL/ELT tools and data pipeline orchestration
✅ Familiarity with cloud platforms (GCP, AWS, or Azure)
✅ Strong understanding of data modeling, data warehousing, and analytics best practices
✅ Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
Job Title : SAP BW/HANA Consultant with ABAP & Power BI
Total Experience : 8+ Years
Relevant Experience : 6+ Years
Location : Taramani, Chennai
Interview Mode : Virtual
Notice Period: Immediate Joiners Only
Job Summary :
We are looking for a seasoned professional with a strong background in Data Warehousing and hands-on experience with SAP BW/HANA. The ideal candidate should possess expertise in ABAP programming, have working knowledge or exposure to Power BI, and show interest in learning cloud technologies.
Mandatory Skills : Data Warehousing, SAP BW, SAP HANA, ABAP Programming, Power BI (basic knowledge), Cloud Technologies (willingness to learn).
Key Responsibilities :
- Design, develop, and support data warehouse solutions using SAP BW and HANA.
- Write efficient ABAP programs to support custom functionality within SAP environments.
- Integrate data and visualizations using Power BI (working knowledge acceptable).
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Stay updated with cloud technology trends and show willingness to adapt and learn new tools and platforms.
Mandatory Skills :
- Strong knowledge of Data Warehousing.
- Hands-on experience with SAP BW and SAP HANA.
- Proficiency in ABAP programming.
- Exposure or experience with Power BI.
- Interest or basic exposure to cloud technologies.

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.
Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd
Job Description:
- Role: Junior Business/Data Analyst (Internship + PPO)
- Work Location: Hyderabad
- Internship Stipend: 15,000 - 25,000/month
- Internship Duration: 3 months
- CTC on PPO: 5 LPA - 6 LPA
Eligibility Criteria:
- Degree: Open to all academic backgrounds
- Graduation Year: 2023, 2024, 2025
Required Skills:
- Proficiency in SQL, Excel, Power BI, and basic Python
- Strong analytical mindset and interest in solving business problems with data
Hiring Process:
- Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
- 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)
Note: Please bring your laptop and earphones for the test.
Register Here: https://go.acciojob.com/69d3Wd
Job Role - Power BI Lead
9 to 12 Years Experience Required.
Location - Pune Baner/ Vimaan Nagar
Work Model - Hybrid (Wednesday and Thursday WFO) 12 PM to 9 PM
Experience with Banking or GRC Domain is preferred.
JOB SUMMARY
Role Overview: We are seeking a highly skilled Power BI Expert to design, develop, implement, and govern Power BI solutions. The ideal candidate will have in-depth knowledge of Power BI architecture, data modeling, governance, embedded analytics, and database management. The role requires expertise in Power BI Data Gateways, report deployment, and governance frameworks, ensuring scalable and secure data solutions.
PRIMARY RESPONSIBILITIES
Power BI Lead & Implementation:
- Design, develop, and deploy interactive Power BI reports and dashboards.
- Create efficient data models to optimize performance and scalability.
- Develop complex DAX expressions for business logic and calculations.
- Optimize report performance by using best practices in Power BI and SQL.
Power BI Architecture & Configuration:
- Configure and manage Power BI Data Gateways for secure and seamless data access.
- Define and enforce Power BI workspace, dataset, and security policies.
- Implement row-level security (RLS) and data governance best practices.
- Establish data refresh schedules and ensure efficient data ingestion pipelines.
- Maintain and enhance Power BI Premium and Embedded solutions.
Embedded Analytics & Integration:
- Integrate Power BI reports with external applications using Power BI Embedded.
- Work with Power BI REST APIs to automate workflows (a minimal sketch follows this posting).
- Integrate Power BI with Oracle, SQL Server, MySQL, Microsoft SharePoint, Excel, cloud data sources, etc.
Database & Performance Optimization:
- Write optimized SQL queries and stored procedures for report development.
- Ensure high-performance data refreshes and query execution.
- Work with the ETL team to improve data integration with Power BI.
Governance & Security:
- Define a Power BI governance framework and best practices for standardization.
- Monitor user access, performance, and usage analytics to drive efficiency.
- Manage user roles, access controls, and data security.
PowerApps & Power Automate (Nice to Have):
- Build PowerApps applications to extend Power BI functionality and create interactive business solutions.
- Automate data flows and reporting updates using Power Automate (Flows, Triggers, Approvals, Notifications, etc.).
- Integrate Power BI, PowerApps, and Power Automate to create end-to-end business process automation.
Stakeholder Collaboration & Training:
- Work closely with business users, data engineers, and leadership teams to understand and document reporting requirements.
- Provide training and best practice guidance to Power BI users across the organization.
- Develop self-service Power BI frameworks to empower business teams for reporting.
- Troubleshoot Power BI performance and user issues.
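To illustrate the Power BI REST API automation mentioned under Embedded Analytics & Integration, here is a hedged sketch that queues a dataset refresh using MSAL client-credential authentication; all IDs are hypothetical placeholders, and the service principal needs the corresponding Power BI permissions.

```python
# Hedged sketch: trigger a Power BI dataset refresh via the REST API,
# authenticating with MSAL client credentials. All IDs are hypothetical.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    client_credential=CLIENT_SECRET,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)
# Assumes the token request succeeds; production code should check
# the result for an "error" key before indexing.
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```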

This is a full-time onsite trainer cum developer role. The goal is to prepare placement students with the technical knowledge, skills, and confidence required to succeed in campus recruitment drives, technical interviews, and entry-level job roles in the industry.
What You Will Do:
The following are high-level responsibilities you will take on, but they are not limited to:
· Develop and maintain data pipelines using Azure Data Factory (ADF) /Databricks for data integration and ETL processes.
· Design, implement, and optimize Power BI /Fabric reports and dashboards to deliver actionable business insights.
· Collaborate with data analysts, business users, and other teams to understand data requirements and deliver solutions using ADF and Power BI.
· Extract, transform, and load (ETL) data from various sources into cloud-based storage systems such as Azure Data Lake or Azure SQL Database.
· Work with large datasets and optimize queries and pipelines for performance and scalability.
· Ensure data quality, integrity, and availability throughout the data lifecycle.
· Automate repetitive data tasks, ensuring timely and accurate reporting.
· Monitor and troubleshoot data pipelines, addressing any performance or data issues promptly.
· Support data visualization and reporting tools, including Power BI, to enable business stakeholders to make data-driven decisions.
· Write clear, efficient, and maintainable code for data transformations and automation.
Required Qualifications:
· Bachelor's degree in computer science, Information Technology, Engineering, or a related field.
· 8+ years of hands-on experience in Data Engineering, BI development, or a similar role.
· Proficiency with Azure Data Factory (ADF), including the creation of data pipelines and managing data flows.
· Strong experience in Power BI, including report creation, dashboard development, and data modeling.
· Experience with SQL and database management (e.g., Azure SQL Database, SQL Server).
· Knowledge of cloud platforms, especially Microsoft Azure.
· Familiarity with data warehousing concepts and ETL processes.
· Experience working with cloud-based data storage solutions (e.g., Azure Data Lake, Azure Blob Storage).
· Strong programming skills in languages such as Python, SQL, or other relevant languages.
· Ability to troubleshoot and optimize data pipelines for performance and reliability.
Preferred Qualifications:
· Familiarity with data modeling techniques and practices for Power BI.
· Knowledge of Azure Databricks or other data processing frameworks.
· Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need:
· B.Tech in Computer Science or equivalent.
Job Description:
- Design and deploy OTBI dashboards and subject areas.
- Build BI Publisher reports (PO Print, Invoice Print, custom KPIs).
- Configure bursting logic and scheduled reports.
- Work with finance and SCM teams to deliver audit-compliant reports.
- Experience Required: At least 1–2 Oracle Cloud BI/reporting implementations with enhancement support.
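For context on the bursting bullet above, here is a minimal sketch of a BI Publisher bursting control query, wrapped in a Python string. The invoice and supplier tables are hypothetical, and the exact mapping of PARAMETER columns to delivery properties should be confirmed against the BI Publisher documentation for your release.

```python
# A BI Publisher bursting definition is driven by a SQL control query that
# returns one row per delivery. The aliases below follow the documented
# convention (KEY, TEMPLATE, OUTPUT_FORMAT, DEL_CHANNEL, PARAMETER1...),
# but the source tables and the PARAMETER1 mapping are assumptions.
BURSTING_QUERY = """
SELECT
    inv.invoice_id            AS KEY,          -- split output per invoice
    'INVOICE_PRINT_TEMPLATE'  AS TEMPLATE,     -- layout template name
    'PDF'                     AS OUTPUT_FORMAT,
    'EMAIL'                   AS DEL_CHANNEL,
    sup.contact_email         AS PARAMETER1    -- recipient (assumed mapping)
FROM ap_invoices inv
JOIN suppliers sup ON sup.supplier_id = inv.supplier_id
"""
```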
Tools:
BI Publisher, OTBI, SQL, Excel (basic & advanced), Power BI, and other BI tools.
Certifications (Mandatory): Oracle BI or OTBI Certification.
Must-have skills: OTBI subject areas, BI Publisher layouts, SQL, scheduling, bursting logic.
Good-to-have skills: ERP reporting (Finance + SCM); any other tool such as Tableau or in-house developed tools.
Soft skills: strong communication, collaboration, product-roadmap thinking, a growth mindset, and the ability to navigate challenges in stakeholder conversations.

Job Summary
We are looking for a highly driven and ambitious person who can lead multi-layered teams and who places great importance on learning and improving. With a relentless focus on execution, you need to be highly data-driven and have a strong sense of ownership. Prior experience in operations, procurement, program management, or process standardization is mandatory.
Job Responsibilities:
- Take complete ownership of project execution (including P&L).
- Take up existing projects and optimize them by closely working with the product team.
- Handle general operations which include daily execution, driving volume, tracking progress, highlighting flags and daily reporting.
- Analyze training needs and provide training.
- Collect feedback on a regular basis and resolve any issues.
- Monitor team performance to achieve KRAs.
- Mentor and motivate the student workforce. Demonstrate strong people engagement skills.
- Run project pilots.
- Suggest improvements in processes at every level in operations.
- Set goals/KPIs/targets for the team members.
Desired Skills
- 0-1 years of relevant experience in managing an operations team.
- Proven ability in driving tightly controlled operational metrics
- Strong process orientation & business acumen
- Good people management, team-building, and program management skills.
- Strongly inclined to do high-quality and impactful work in a dynamic and unstructured environment.
- Higher than normal sense of ownership with a clear bias for action.
- Relevant educational qualification.
- Must have impeccable verbal and written communication skills (both English and Hindi).


Senior Data Engineer
Location: Bangalore, Gurugram (Hybrid)
Experience: 4-8 Years
Type: Full Time | Permanent
Job Summary:
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.
This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.
Key Responsibilities:
PostgreSQL & Data Modeling
· Design and optimize complex SQL queries, stored procedures, and indexes
· Perform performance tuning and query plan analysis
· Contribute to schema design and data normalization
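As one hedged illustration of the query-plan analysis above, the sketch below runs EXPLAIN (ANALYZE, BUFFERS) over a candidate query with psycopg2 and prints the resulting plan. The DSN, table, and column names are hypothetical.

```python
import psycopg2

# Hypothetical DSN and candidate query; adjust to your environment.
DSN = "dbname=analytics user=report_ro host=localhost"
CANDIDATE = """
SELECT customer_id, SUM(amount)
FROM orders
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY customer_id
"""

with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
    # EXPLAIN ANALYZE actually executes the query and reports real timings,
    # so run it against a replica or non-critical data.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + CANDIDATE)
    for (plan_line,) in cur.fetchall():
        print(plan_line)
```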
Data Migration & Transformation
· Migrate data from multiple sources to cloud or ODS platforms
· Design schema mapping and implement transformation logic
· Ensure consistency, integrity, and accuracy in migrated data
Python Scripting for Data Engineering
· Build automation scripts for data ingestion, cleansing, and transformation
· Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
· Maintain reusable script modules for operational pipelines
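A minimal sketch of that kind of script, assuming a hypothetical S3 bucket and key layout: it pulls a JSON extract with Boto3, keeps only the fields the pipeline needs, and writes a CSV back for downstream loading.

```python
import csv
import io
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical bucket names and object keys.
raw = s3.get_object(Bucket="raw-events", Key="2024/01/events.json")
records = json.loads(raw["Body"].read())

# Cleanse/flatten: project each record onto the columns we care about.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["event_id", "user_id", "ts"])
writer.writeheader()
for rec in records:
    writer.writerow({k: rec.get(k) for k in ("event_id", "user_id", "ts")})

s3.put_object(Bucket="staging-events", Key="2024/01/events.csv", Body=buf.getvalue())
```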
Data Orchestration with Apache Airflow
· Develop and manage DAGs for batch/stream workflows
· Implement retries, task dependencies, notifications, and failure handling
· Integrate Airflow with cloud services, data lakes, and data warehouses
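For instance, a minimal Airflow DAG with retries, a task dependency, and failure notifications might look like the sketch below; the task bodies and alert address are placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # placeholder: pull data from source systems

def load():
    pass  # placeholder: write data to the warehouse

default_args = {
    "retries": 2,                          # re-run a failed task twice
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,
    "email": ["data-alerts@example.com"],  # hypothetical alert address
}

with DAG(
    dag_id="nightly_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # dependency: load runs only after extract succeeds
```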
Cloud Platforms (AWS / Azure / GCP)
· Manage data storage (S3, GCS, Blob), compute services, and data pipelines
· Set up permissions, IAM roles, encryption, and logging for security
· Monitor and optimize cost and performance of cloud-based data operations
Data Marts & Analytics Layer
· Design and manage data marts using dimensional models
· Build star/snowflake schemas to support BI and self-serve analytics
· Enable incremental load strategies and partitioning
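As a toy illustration of the star-schema bullet above, the sketch below creates a minimal star schema (one fact table, two dimensions) using SQLite purely so the example is self-contained; in practice these marts would live in the warehouse platform.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the real warehouse
conn.executescript("""
-- Dimension tables hold descriptive attributes.
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT,
    month     INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name         TEXT,
    region       TEXT
);
-- The fact table holds measures plus foreign keys into each dimension.
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount       REAL
);
""")

# A typical BI query joins the fact to its dimensions and aggregates.
query = """
SELECT d.month, c.region, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d     ON d.date_key = f.date_key
JOIN dim_customer c ON c.customer_key = f.customer_key
GROUP BY d.month, c.region
"""
print(conn.execute(query).fetchall())  # empty until the mart is loaded
```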
Modern Data Stack Integration
· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
· Support modular pipeline design and metadata-driven frameworks
· Ensure high availability and scalability of the stack
BI & Reporting Tools (Power BI / Superset / Supertech)
· Collaborate with BI teams to design datasets and optimize queries
· Support development of dashboards and reporting layers
· Manage access, data refreshes, and performance for BI tools
Required Skills & Qualifications:
· 4–6 years of hands-on experience in data engineering roles
· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
· Advanced Python scripting skills for automation and ETL
· Proven experience with Apache Airflow (custom DAGs, error handling)
· Solid understanding of cloud architecture (especially AWS)
· Experience with data marts and dimensional data modeling
· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
· Version control (Git) and CI/CD pipeline knowledge is a plus
· Excellent problem-solving and communication skills
About the Role
We are seeking an innovative Data Scientist specializing in Natural Language Processing (NLP) to join our technology team in Bangalore. The ideal candidate will harness the power of language models and document extraction techniques to transform legal information into accessible, actionable insights for our clients.
Responsibilities
- Develop and implement NLP solutions to automate legal document analysis and extraction (see the sketch after this list)
- Create and optimize prompt engineering strategies for large language models
- Design search functionality leveraging semantic understanding of legal documents
- Build document extraction pipelines to process unstructured legal text data
- Develop data visualizations using PowerBI and Tableau to communicate insights
- Collaborate with product and legal teams to enhance our tech-enabled services
- Continuously improve model performance and user experience.
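A minimal sketch of the document-extraction idea above, assuming the OpenAI Python SDK: it prompts a model to pull a few structured fields out of a contract. The model choice and the extracted fields are illustrative, not a prescribed design.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_clauses(contract_text: str) -> str:
    """Ask an LLM to pull structured fields out of a legal document."""
    prompt = (
        "Extract the parties, effective date, and governing law from the "
        "contract below. Respond with a JSON object using those three keys.\n\n"
        + contract_text
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",   # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0,         # keep extraction output as deterministic as possible
    )
    return resp.choices[0].message.content
```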
Requirements
- Bachelor's degree in a relevant field
- 1-5 years of professional experience in data science, with a focus on NLP applications
- Demonstrated experience working with LLM APIs (e.g., OpenAI, Anthropic)
- Proficiency in prompt engineering and optimization techniques
- Experience with document extraction and information retrieval systems
- Strong skills in data visualization tools, particularly PowerBI and Tableau
- Excellent programming skills in Python and familiarity with NLP libraries
- Strong understanding of legal terminology and document structures (preferred)
- Excellent communication skills in English
What We Offer
- Competitive salary and benefits package
- Opportunity to work at India's largest legal tech company
- Professional growth in the fast-evolving legal technology sector
- Collaborative work environment with industry experts
- Modern office located in Bangalore
- Flexible work arrangements
Qualified candidates are encouraged to apply with a resume highlighting relevant experience with NLP, prompt engineering, and data visualization tools.
Location: Bangalore, India
We’re looking for an experienced SQL Developer with 3+ years of hands-on experience to join our growing team. In this role, you’ll be responsible for designing, developing, and maintaining SQL queries, procedures, and data systems that support our business operations and decision-making processes. You should be passionate about data, highly analytical, and capable of working both independently and collaboratively with cross-functional teams.
Key Responsibilities:
Design, develop, and maintain complex SQL queries, stored procedures, functions, and views.
Optimize existing queries for performance and efficiency.
Collaborate with data analysts, developers, and stakeholders to understand requirements and translate them into robust SQL solutions.
Design and implement ETL processes to move and transform data between systems.
Perform data validation, troubleshooting, and quality checks (see the sketch after this list).
Maintain and improve existing databases, ensuring data integrity, security, and accessibility.
Document code, processes, and data models to support scalability and maintainability.
Monitor database performance and provide recommendations for improvement.
Work with BI tools and support dashboard/report development as needed.
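To make the validation bullet concrete, here is a small sketch of a row-count reconciliation check between a staging table and its target, written against SQLite purely so it runs anywhere; the table names are hypothetical.

```python
import sqlite3

def counts_match(conn: sqlite3.Connection, source: str, target: str) -> bool:
    """Basic data-quality check: source and target row counts must agree.

    source/target are trusted identifiers here, not user input.
    """
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    if src != tgt:
        print(f"Row-count mismatch: {source}={src}, {target}={tgt}")
    return src == tgt

# Usage sketch with hypothetical tables:
# conn = sqlite3.connect("warehouse.db")
# assert counts_match(conn, "stg_orders", "dw_orders")
```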
Requirements:
3+ years of proven experience as an SQL Developer or in a similar role.
Strong knowledge of SQL and relational database systems (e.g., MS SQL Server, PostgreSQL, MySQL, Oracle).
Experience with performance tuning and optimization.
Proficiency in writing complex queries and working with large datasets.
Experience with ETL tools and data pipeline creation.
Familiarity with data warehousing concepts and BI reporting.
Solid understanding of database security, backup, and recovery.
Excellent problem-solving skills and attention to detail.
Good communication skills and ability to work in a team environment.
Nice to Have:
Experience with cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
Knowledge of Python, Power BI, or other scripting/analytics tools.
Experience working in Agile or Scrum environments.

About the company:
Ketto is Asia's largest tech-enabled crowdfunding platform with a vision of healthcare for all. We are a profit-making organization with a valuation of more than 100 million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us create large-scale impact daily by taking our product to the next level.
Role Overview:
Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.
Key Responsibilities
● Data Strategy & Automation:
○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.
○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.
● Data Analysis & Insight Generation:
○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.
○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.
● Reporting & Quality Assurance:
○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.
○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.
● Collaboration & Strategic Planning:
○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.
○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.
○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.
Required Skills and Qualifications
● Technical Expertise:
○ Strong background in SQL, statistics, and mathematics
● Analytical & Strategic Mindset:
○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.
○ Experience with statistical analysis and advanced analytics
● Communication & Collaboration:
○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.
○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.
● Preferred Experience:
○ Proven experience in advanced analytics roles
○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.
Why Join Ketto?
At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Sr. Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!
We are seeking an experienced Senior Functional Consultant with 7+ years of experience in Microsoft Dynamics 365 Marketing Automation.
Key Requirements:
- 7+ years of experience in MS Dynamics 365 Marketing Automation
- Expertise in Customer Segmentation, Real-Time Marketing & Personalization
- Strong experience in Email & Campaign Automation
- Hands-on knowledge of Power Automate for Marketing Workflows
- Proficiency in Data Analytics, Reporting (Customer Insights AI), and Customer Journey Orchestration
- Experience with Dynamics 365 Sales, Power BI, and Integration
- Self-management and accountability for SLAs & deliverables
Job Title: Data Visualization Engineer
Experience: 2 to 4 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a skilled Data Visualization Engineer with expertise in Qlik Sense and experience working with reporting tools like PowerBI, Tableau, Looker, and Qlik Sense. The ideal candidate will have a strong understanding of QVF and QVD structures, basic HTTP API integrations, and end-to-end data pipelines. Some knowledge of Python for data processing and automation will be a plus. This role will primarily focus on Qlik Sense reporting.
Key Responsibilities:
1. Data Visualization & Reporting
- Design, develop, and maintain interactive dashboards and reports using Qlik Sense.
- Work with PowerBI, Tableau, Looker, and Qlik Sense to create compelling data visualizations.
- Ensure seamless data representation and storytelling through dashboards.
2. Qlik Sense Development & Optimization
- Develop and manage QVF and QVD structures for optimized data retrieval.
- Implement best practices in Qlik Sense scripting, data modeling, and performance tuning.
- Maintain and optimize existing Qlik Sense applications.
3. Data Integration & API Interactions
- Utilize basic HTTP APIs to integrate external data sources into dashboards.
- Work with data teams to ensure smooth data ingestion and transformation for visualization.
4. End-to-End Data Pipeline Understanding
- Collaborate with data engineers to understand and optimize data flows from source to visualization.
- Ensure data consistency, integrity, and performance in reporting solutions.
5. Scripting & Automation
- Utilize Python for data manipulation, automation, and minor custom integrations.
- Improve reporting workflows through automation scripts and process optimizations.
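A minimal sketch of such an automation, assuming a hypothetical HTTP endpoint and token: it pulls JSON over the API and writes a flat CSV that a Qlik Sense load script can then ingest on reload.

```python
import csv

import requests

# Hypothetical endpoint and bearer token.
URL = "https://api.example.com/v1/sales"
resp = requests.get(URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
resp.raise_for_status()
rows = resp.json()  # expected shape: a list of flat JSON records

with open("sales_extract.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["sale_id", "region", "amount"])
    writer.writeheader()
    for row in rows:
        writer.writerow({k: row.get(k) for k in ("sale_id", "region", "amount")})
# A Qlik Sense load script can read sales_extract.csv on the next reload.
```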
Technical Expertise Required:
- 2 to 4 years of experience in Data Visualization or BI Reporting roles.
- Strong experience with Qlik Sense (QVF & QVD structures, scripting, visualization).
- Hands-on experience with PowerBI, Tableau, Looker.
- Basic understanding of HTTP APIs for data integration.
- Understanding of end-to-end data pipelines.
- Knowledge of Python for automation and data transformation.
- Experience in performance optimization of dashboards and reports.
- Strong analytical and problem-solving skills.
Preferred Qualifications:
- Experience in data modeling and ETL concepts.
- Familiarity with cloud-based data visualization solutions.
- Understanding of data governance and security best practices.
We are seeking a highly skilled and experienced Power BI Lead / Architect to join our growing team. The ideal candidate will have a strong understanding of data warehousing, data modeling, and business intelligence best practices. This role will be responsible for leading the design, development, and implementation of complex Power BI solutions that provide actionable insights to key stakeholders across the organization.
Location - Pune (Hybrid 3 days)
Responsibilities:
Lead the design, development, and implementation of complex Power BI dashboards, reports, and visualizations.
Develop and maintain data models (star schema, snowflake schema) for optimal data analysis and reporting.
Perform data analysis, data cleansing, and data transformation using SQL and other ETL tools.
Collaborate with business stakeholders to understand their data needs and translate them into effective and insightful reports.
Develop and maintain data pipelines and ETL processes to ensure data accuracy and consistency.
Troubleshoot and resolve technical issues related to Power BI dashboards and reports.
Provide technical guidance and mentorship to junior team members.
Stay abreast of the latest trends and technologies in the Power BI ecosystem.
Ensure data security, governance, and compliance with industry best practices.
Contribute to the development and improvement of the organization's data and analytics strategy.
May lead and mentor a team of junior Power BI developers.
Qualifications:
8-12 years of experience in Business Intelligence and Data Analytics.
Proven expertise in Power BI development, including DAX and advanced data modeling techniques.
Strong SQL skills, including writing complex queries, stored procedures, and views.
Experience with ETL/ELT processes and tools.
Experience with data warehousing concepts and methodologies.
Excellent analytical, problem-solving, and communication skills.
Strong teamwork and collaboration skills.
Ability to work independently and proactively.
Bachelor's degree in Computer Science, Information Systems, or a related field preferred.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (see the sketch after this list).
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
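As a hedged sketch of the ingestion bullet above, the snippet below uses the Snowflake Python connector to bulk-load staged JSON files into a raw table; Snowpipe automates this same COPY pattern. The connection parameters, stage, and table names are hypothetical.

```python
import snowflake.connector

# Hypothetical connection parameters.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="etl_user",
    password="...",            # prefer key-pair auth or a secrets manager
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Bulk-load staged JSON files into the raw table, skipping bad rows.
    cur.execute("""
        COPY INTO raw_events
        FROM @events_stage
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'CONTINUE'
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```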
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
Job Title: Developer
Location: [Company Location or Remote]
Job Type: [Full-time/Part-time/Contract]
Experience Level: [Entry-level/Junior/Mid-level/Senior]
Job Summary:
We are seeking a skilled Developer to design, develop, and maintain high-quality software solutions. The ideal candidate should have strong problem-solving abilities, proficiency in programming languages, and a passion for technology. You will work closely with cross-functional teams to develop scalable and efficient applications.
Key Responsibilities:
- Design, develop, test, and deploy software applications.
- Write clean, efficient, and well-documented code.
- Collaborate with designers, product managers, and other developers.
- Troubleshoot and debug applications to optimize performance.
- Stay updated with emerging technologies and industry trends.
- Participate in code reviews and provide constructive feedback.
- Integrate third-party APIs and databases as needed.
- Ensure software security, scalability, and maintainability.
Required Skills & Qualifications:
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.
- Proficiency in [mention relevant programming languages, e.g., Python, Java, JavaScript, C++].
- Experience with [mention frameworks, e.g., React, Angular, Django, Flask, Spring Boot].
- Knowledge of databases such as [MySQL, PostgreSQL, MongoDB].
- Familiarity with version control systems like Git and GitHub.
- Strong problem-solving and analytical skills.
- Excellent teamwork and communication skills.
- Ability to work in an agile development environment.
Preferred Qualifications (if applicable):
- Experience in cloud technologies like AWS, Azure, or Google Cloud.
- Knowledge of DevOps practices and CI/CD pipelines.
- Experience with containerization tools like Docker and Kubernetes.
- Understanding of AI/ML concepts (for AI-related roles).
Benefits:
- Competitive salary and performance-based bonuses.
- Flexible work hours and remote work options.
- Health insurance and wellness programs.
- Career development and learning opportunities.
- Friendly and collaborative work culture.
Job Title : Senior AWS Data Engineer
Experience : 5+ Years
Location : Gurugram
Employment Type : Full-Time
Job Summary :
We are seeking a Senior AWS Data Engineer to design, build, and optimize scalable data pipelines and data architectures on AWS. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.); see the sketch after this list.
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
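For instance, orchestrating one of those Glue pipelines from Python might look like the sketch below; the job name, region, and arguments are hypothetical.

```python
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # hypothetical region

# Kick off a (hypothetical) Glue ETL job with runtime arguments.
run = glue.start_job_run(
    JobName="nightly-orders-etl",
    Arguments={"--target_path": "s3://analytics-lake/curated/orders/"},
)

# Check the run state (a scheduler such as Airflow would normally poll this).
status = glue.get_job_run(JobName="nightly-orders-etl", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED
```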
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.
Job Opening: ERP Developer – Noida/Gurgaon
📍 Location: Noida/Gurgaon
💼 Experience: 5-8 Years
We are looking for an ERP Developer with expertise in D365 Finance and Operations to join our team. If you have hands-on experience in Microsoft Dynamics AX 2012 R3, Power BI, Power Apps, MS SQL Server, and SSRS reports, we want to hear from you!
🔹 Key Skills & Expertise:
✅ D365 Finance & Operations (Finance Consultant Role)
✅ Microsoft Dynamics AX 2012 R3 / D365 Technical Development
✅ Power BI & Power Apps Platform
✅ MS SQL Server & SSRS Reports
✅ 24x7 ERP Support & Implementation