50+ PowerBI Jobs in India

Hiring: Business Analyst Interns (Unpaid)
💻 Full-Time Remote Internship | Unpaid | Duration: 6 Months
📄 Certification of Completion | 5 Days Working (Mon–Fri)
🕛 Shift Timing: 12:00 PM to 9:00 PM
📝 Requirements:
• Basic knowledge of business analysis concepts
• Comfortable with Excel, Google Sheets & documentation
• Strong analytical skills & problem-solving mindset
• Good communication and writing ability
• Self-motivated and eager to learn
🎁 What You’ll Gain:
• Real-world learning experience in a tech company
• Exposure to structured business and data processes
• Internship Completion Certificate
• Potential for full-time employment based on performance
Job Title: Financial Business Analyst
Location: Trichy/Bangalore
Employment Type: Full-Time
Department: Finance / Strategy
Reporting To: CFO
Job Summary:
We are seeking a highly analytical and detail-oriented Financial Business Analyst to join our finance team. The ideal candidate will have a solid foundation in financial analysis, business planning, and advanced Excel modeling. You will play a key role in driving strategic decisions by analyzing business performance, identifying trends, and building financial models to support forecasting and investment initiatives.
Key Responsibilities:
· Analyze historical financial data and business performance to identify trends, risks, and opportunities
· Prepare detailed models on cash flow, vendor and bank payments, receivables, and portfolio analytics
· Collaborate with cross-functional teams to gather data, validate assumptions, and support operational planning
· Develop dashboards and reports to monitor key performance indicators (KPIs) and provide actionable insights
· Perform variance analysis comparing actual vs. budget/forecast and explain business drivers
· Support strategic initiatives, including business case development, cost optimization, and capital investment analysis
· Prepare executive presentations and provide concise summaries of financial findings
· Assist in improving financial systems, processes, and reporting efficiency
· Prepare detailed reports and dashboards for other functional teams, partners and senior leadership
Qualifications:
· Bachelor's degree in Finance, Accounting, Economics, or related field (MBA or CFA is a plus)
· 3–5 years of experience in financial/business analysis, preferably in a corporate or consulting environment
· Expert-level proficiency in Microsoft Excel (VLOOKUP, INDEX-MATCH, pivot tables, nested formulas, scenario analysis, macros a plus)
· Strong understanding of financial statements, modeling logic, and investment metrics (NPV, IRR, ROI, etc.)
· Excellent communication and presentation skills with the ability to translate complex data into clear insights
· Proficiency in PowerPoint; familiarity with BI tools (Power BI, Tableau) or ERP systems is a plus
· Strong attention to detail and ability to manage multiple priorities in a fast-paced environment
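The investment metrics named in the qualifications above (NPV, IRR) can be sketched in a few lines of Python; the cash flows and the 10% discount rate below are illustrative assumptions, not figures from this role:

```python
# Minimal sketch of two investment metrics mentioned above: NPV and IRR.
# The cash flows and discount rate are illustrative assumptions.

def npv(rate, cash_flows):
    """Net present value of cash flows, one per period (t = 0, 1, 2, ...)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection: the rate at which NPV = 0."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid   # NPV still positive: the break-even rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1000, 400, 400, 400]      # initial outlay, then three annual inflows
print(round(npv(0.10, flows), 2))   # NPV at a 10% discount rate
print(round(irr(flows), 4))         # break-even discount rate
```

The same calculation is what Excel's `NPV`/`IRR` functions (also cited above) perform under the hood.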
Preferred Traits:
· Highly analytical with a passion for solving business problems
· Self-starter who takes initiative and ownership of projects
· Comfortable working with ambiguity and building structure from scratch
· Collaborative, curious, and motivated to drive financial excellence



We are seeking a detail-oriented and analytical Data Analyst to collect, process, and analyze data to help drive informed business decisions. The ideal candidate will have strong technical skills, business acumen, and the ability to communicate insights effectively.

We are looking for an experienced and dynamic Technical Trainer to join our team. The ideal candidate will be responsible for designing, developing, and delivering high-quality technical training programs to students.


We are looking for a dynamic and skilled Business Analyst Trainer with 2 to 5 years of hands-on industry and/or teaching experience. The ideal candidate should be able to simplify complex data concepts, mentor aspiring professionals, and deliver effective training programs in Business Analysis, Power BI, Tableau, and Machine Learning.
We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.
Job Description :
We are seeking a highly experienced Sr Data Modeler / Solution Architect to join the Data Architecture team at Corporate Office in Bangalore. The ideal candidate will have 4 to 8 years of experience in data modeling and architecture with deep expertise in AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units.
This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.
Key Responsibilities:
· Design and deliver conceptual, logical, and physical data models using tools like ERWin.
· Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
· Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
· Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
· Develop and optimize SQL code, materialized views, stored procedures in AWS Redshift.
· Ensure data governance, lineage, and quality mechanisms are established across systems.
· Lead and mentor technical teams in an Agile project delivery model.
· Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
· Identify data gaps and availability issues with respect to source systems.
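The Layered Scalable / Medallion Architecture named above can be sketched as three layers: a raw landing zone, a cleaned layer, and a business-level mart. This is a toy illustration, with made-up record fields, not the company's actual pipeline:

```python
# A minimal sketch of the Medallion (layered) architecture mentioned above:
# bronze holds raw records, silver cleans and types them, gold aggregates
# them into a mart. The record fields are illustrative assumptions.
bronze = [
    {"store": " S01 ", "sales": "100"},
    {"store": "S02",   "sales": "250"},
    {"store": " S01 ", "sales": "75"},
]

# Silver layer: cleaned and typed.
silver = [{"store": r["store"].strip(), "sales": int(r["sales"])} for r in bronze]

# Gold layer: business-level aggregate (sales by store), ready for a data mart.
gold = {}
for r in silver:
    gold[r["store"]] = gold.get(r["store"], 0) + r["sales"]

print(gold)   # {'S01': 175, 'S02': 250}
```

In an AWS stack like the one described, bronze/silver/gold would typically map to S3 zones and Redshift schemas rather than in-memory lists.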
Required Skills & Qualifications:
· Bachelor’s or Master’s degree in Computer Science, IT, or related field (B.E./B.Tech/M.E./M.Tech/MCA).
· Minimum 4 years of experience in data modeling and architecture.
· Proficiency with data modeling tools such as ERWin, with strong knowledge of forward and reverse engineering.
· Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
· Strong experience in data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
· Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
· Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
· Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
· Familiarity with data governance tools (e.g., ORION/EIIG).
· Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
· Excellent written and verbal communication skills.
Certifications (Preferred):
· AWS Certification (e.g., AWS Certified Solutions Architect or Data Analytics – Specialty)
· Data Governance or Data Modeling Certifications (e.g., CDMP, Databricks, or TOGAF)
Mandatory Skills
AWS, Technical Architecture, AI/ML, SQL, Data Warehousing, Data Modelling
Job Description:
We are seeking a skilled Power BI Developer with a strong understanding of Capital Markets to join our data analytics team. The ideal candidate will be responsible for designing, developing, and maintaining interactive dashboards and reports that provide insights into trading, risk, and financial performance. This role requires experience working with capital market data sets and a solid grasp of financial instruments and market operations.
Key Responsibilities:
- Develop interactive Power BI dashboards and reports tailored to capital markets (e.g., equities, derivatives, fixed income).
- Connect to and integrate data from various sources such as Bloomberg, Reuters, SQL databases, and Excel.
- Translate business requirements into data models and visualizations that provide actionable insights.
- Optimize Power BI reports for performance, usability, and scalability.
- Work closely with business stakeholders (trading, risk, compliance) to understand KPIs and analytics needs.
- Implement row-level security and data access controls.
- Maintain data quality, lineage, and versioning documentation.
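The kind of data shaping that feeds a capital-markets dashboard like the one described above can be sketched as a simple aggregation of raw trades into per-desk exposure; the trade records and field names below are illustrative assumptions:

```python
# Hypothetical sketch of shaping trade data upstream of a BI dashboard:
# aggregating raw trades into signed notional exposure per desk.
# The records and field names are illustrative, not a real feed schema.
from collections import defaultdict

trades = [
    {"desk": "equities",     "instrument": "AAPL",   "qty": 100, "price": 190.0},
    {"desk": "equities",     "instrument": "MSFT",   "qty": -50, "price": 410.0},
    {"desk": "fixed_income", "instrument": "UST10Y", "qty": 10,  "price": 98.5},
]

exposure = defaultdict(float)
for t in trades:
    exposure[t["desk"]] += t["qty"] * t["price"]   # signed notional

for desk, notional in sorted(exposure.items()):
    print(desk, round(notional, 2))
```

In practice this aggregation would be expressed as a DAX measure or a SQL view behind the Power BI model rather than ad-hoc Python.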
Required Skills & Qualifications:
- 3+ years of experience with Power BI (Power Query, DAX, data modeling).
- Strong understanding of capital markets: trading workflows, market data, instruments (equities, bonds, derivatives, etc.).
- Experience with SQL and working with large financial datasets.
- Familiarity with risk metrics, trade lifecycle, and financial statement analysis.
- Knowledge of data governance, security, and performance tuning in BI environments.
- Excellent communication skills and ability to work with cross-functional teams.
Preferred Qualifications:
- Experience with Python or R for data analysis.
- Knowledge of investment banking or asset management reporting frameworks.
- Exposure to cloud platforms like Azure, AWS, or GCP.
- Certifications in Power BI or Capital Markets.

Only candidates currently in Bihar, or open to relocating to Bihar, should apply:
Job Description:
This is an exciting opportunity for an experienced industry professional with strong analytical and technical skills to join and add value to a dedicated and friendly team. We are looking for a Data Analyst who is driven by data-driven decision-making and insights. As a core member of the Analytics Team, the candidate will take ownership of data analysis projects by working independently with little supervision.
The ideal candidate is a highly resourceful and innovative professional with extensive experience in data analysis, statistical modeling, and data visualization. The candidate must have a strong command of data analysis tools like SAS/SPSS, Power BI/Tableau, or R, along with expertise in MS Excel and MS PowerPoint. The role requires optimizing data collection procedures, generating reports, and applying statistical techniques for hypothesis testing and data interpretation.
Key Responsibilities:
• Perform data analysis using tools like SAS, SPSS, Power BI, Tableau, or R.
• Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis.
• Utilize statistical techniques for hypothesis testing to validate data and interpretations.
• Apply data mining techniques and OLAP methodologies for in-depth insights.
• Develop dashboards and data visualizations to present findings effectively.
• Collaborate with cross-functional teams to define, design, and execute data-driven strategies.
• Ensure the accuracy and integrity of data used for analysis and reporting.
• Utilize advanced Excel skills to manipulate and analyze large datasets.
• Prepare technical documentation and presentations for stakeholders.
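The hypothesis testing mentioned in the responsibilities above can be sketched with a Welch two-sample t-statistic using only the standard library; the sample data are illustrative assumptions:

```python
# A minimal sketch of two-sample hypothesis testing (Welch's t-statistic)
# using only the standard library. The sample data are made up for
# illustration; a real analysis would use SAS/SPSS/R as listed above.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """t-statistic for the difference in means of two independent samples."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

before = [12.1, 11.8, 12.4, 12.0, 11.9]   # e.g., weekly sales before a change
after  = [12.9, 13.1, 12.7, 13.0, 12.8]   # weekly sales after the change
print(round(welch_t(after, before), 2))    # compare against a t critical value
```

A t-statistic this far from zero would typically lead to rejecting the null hypothesis of equal means, subject to the usual checks on sample size and distribution.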
Candidate Profile:
Required Qualifications:
• Qualification: MCA, Graduate/Post Graduate in Statistics, or BE/B.Tech in Computer Science & Engineering, Information Technology, or Electronics.
• A minimum of 2 years' experience in data analysis using SAS/SPSS, Power BI/Tableau, or R.
• Proficiency in MS Office with expertise in MS Excel & MS PowerPoint.
• Strong analytical skills with attention to detail.
• Experience in data mining and OLAP methodologies.
• Ability to generate insights and reports based on data trends.
• Excellent communication and presentation skills.
Desired Qualifications:
• Experience in predictive analytics and machine learning techniques.
• Knowledge of SQL and database management.
• Familiarity with Python for data analysis.
• Experience in automating reporting processes.
We are seeking a detail-oriented and analytical Business Analyst to bridge the gap between business needs and technology solutions. The ideal candidate will be responsible for analyzing business processes, identifying improvement areas, and supporting data-driven decision-making through insights and documentation.


We are seeking a passionate and knowledgeable Data Science and Data Analyst Trainer to deliver engaging and industry-relevant training programs. The trainer will be responsible for teaching core concepts in data analytics, machine learning, data visualization, and related tools and technologies. The ideal candidate will have 2-5 years of hands-on experience in the data domain and a flair for teaching and mentoring students or working professionals.


We are looking for a dynamic and skilled Data Science and Data Analyst Trainer with 2 to 5 years of hands-on industry and/or teaching experience. The ideal candidate should be able to simplify complex data concepts, mentor aspiring professionals, and deliver effective training programs in data analytics, data science, and business intelligence tools.


We are seeking a dynamic and experienced Data Analytics and Data Science Trainer to deliver high-quality training sessions, mentor learners, and design engaging course content. The ideal candidate will have a strong foundation in statistics, programming, and data visualization tools, and should be passionate about teaching and guiding aspiring professionals.

Job Title: Manager - Retail Analyst and Communication (Apparel Retail)
Experience: 6-8 years in Retail Business Management Information System
Location: Gurgaon
Salary: Negotiable
Industry: Retail/ Apparel/ Fashion
This role is responsible for managing the end-to-end Retail Business Management Information System (MIS). It involves handling internal communication channels, ensuring adherence to reporting processes and implementing weekly actions based on data insights.
Key Deliverables (Essential functions & Responsibilities of the Job):
· Own and manage all aspects of the MIS and data systems, including report and dashboard generation.
· Coordinate with cross-functional teams for data collection, reporting, and timely store projects.
· Lead monthly, quarterly, and annual target setting for sales and key retail performance metrics.
· Monitor weekly, monthly, and seasonal performance, and circulate actionable insights and reports to stores and leadership.
· Deliver weekly retail communications, ensuring clarity on actions required.
· Support ad hoc data requirements with advanced Excel capabilities.
· Work with tools like SAP, OneDrive, Tableau, Power BI, etc.
Key Skills Required:
· Strong multitasking and time management skills.
· Self-driven, organized, and goal-oriented.
· Analytical mindset with the ability to convert data into actionable insights.
· Proficiency in Microsoft Office Suite, particularly Excel; familiarity with data visualization tools preferred.
Mail updated resume with current salary to:
email: etalenthire[at]gmail[dot]com
satish: 88 O2 74 97 43
website: www.glansolutions.com

We’re seeking a detail-oriented Data Analyst with proven experience in corporate settings to transform raw data into actionable insights. You’ll collaborate across departments to support strategic decision-making, optimize operations, and enhance business performance through data-driven analysis.

We are looking for a skilled and detail-oriented Data Analyst – Data Scientist (DA-DS) to join our team. This hybrid role involves analyzing large datasets to extract insights, build predictive models, and support data-driven decision-making. You’ll work closely with cross-functional teams to transform raw data into actionable insights using statistical techniques, data visualization, and machine learning tools.
Immediate Hiring for Business Analyst
Position: Business Analyst
Experience: 5 - 8 Years
Location: Hyderabad
Job Summary:
We are seeking a motivated and detail-oriented Business Analyst with 5 years of experience in the Travel domain. The ideal candidate will have a strong understanding of the travel industry, including airlines, travel agencies, and online booking systems. You will work closely with cross-functional teams to gather business requirements, analyze processes, and deliver solutions that improve customer experience and operational efficiency.
Key Responsibilities:
- Requirement Gathering & Analysis: Collaborate with stakeholders to gather, document, and analyze business requirements, ensuring alignment with business goals.
- Process Improvement: Identify opportunities for process improvement and optimization in travel booking, ticketing, and customer support systems.
- Stakeholder Communication: Act as the bridge between the business stakeholders and technical teams, ensuring clear communication of requirements, timelines, and deliverables.
- Solution Design: Participate in the design and development of solutions, collaborating with IT and development teams to ensure business needs are met.
- Data Analysis: Analyze data related to customer journeys, bookings, and cancellations to identify trends and insights for decision-making.
- Documentation: Prepare detailed documentation including business requirements documents (BRD), user stories, process flows, and functional specifications.
- Testing & Validation: Support testing teams during User Acceptance Testing (UAT) to ensure solutions meet business needs, and facilitate issue resolution.
- Market Research: Stay up to date with travel industry trends, customer preferences, and competitor offerings to ensure innovative solutions are delivered.
Qualifications & Skills:
- Education: Bachelor’s degree in Business Administration, Information Technology, or a related field.
- Experience:
- 5 years of experience as a Business Analyst in the travel industry.
- Hands-on experience in working with travel booking systems (GDS, OTA) is highly preferred.
- Domain Knowledge:
- Strong understanding of the travel industry, including booking engines, reservations, ticketing, cancellations, and customer support.
- Familiarity with industry-specific regulations and best practices.
- Analytical Skills: Excellent problem-solving skills with the ability to analyze complex data and business processes.
- Technical Skills:
- Proficiency in Microsoft Office (Word, Excel, PowerPoint).
- Knowledge of SQL or data visualization tools (Power BI, Tableau) is a plus.
- Communication: Strong verbal and written communication skills with the ability to convey complex information clearly.
- Attention to Detail: Strong focus on accuracy and quality of work, ensuring that solutions meet business requirements.
Preferred:
- Prior experience with Agile methodologies.
- Certification in Business Analysis (CBAP or similar).

We are seeking a highly motivated and knowledgeable DADS Trainer to conduct hands-on training in Data Analytics and Data Science. The ideal candidate will have strong domain expertise, coding proficiency, and a passion for teaching concepts in Python, statistics, machine learning, data visualization, and tools like Excel, Power BI, and SQL.
Roles and Responsibilities:
The purpose of the role is to provide timely, accurate, and quality MIS reports and dashboards to the external and internal stakeholders of the account(s), as per the defined process and standards of security and compliance.
• Prepare timely and accurate MIS reports and dashboards as required by the stakeholders
• Interact and work closely with management, internal stakeholders & clients to understand the business information needs
• Ensure all reports & dashboards are prepared as per stakeholder requirements and at the desired frequency (weekly/monthly/quarterly)
• Ensure regular review with the MIS Team Lead for 100% accuracy before populating any customized dashboard or generating any customized report
• Track and follow up with relevant stakeholders for timely updates and data management of parameters (key SLA metrics such as run-rate etc.)
• Generate account level reports (billable and non-billable) on forecasting, scheduling (both onshore and offshore) and performance against SLAs, CSAT, Quality etc.
• Ensure zero non-compliances on process audit on data security and compliance
• Support and adopt tools and systems for efficient MIS generation and reporting system
• Continuous support to the manager in rolling out new techniques and initiatives to increase productivity
• Providing update to the manager on the progress of any new MIS initiatives
• Perform periodic maintenance and servicing of MIS system to improve operational efficiency
• Adopt new tools and technology solutions, and develop capability through training to improve their own productivity.
• Develop analytical skills and understanding of statistical analysis to suggest improvement in the quality of analysis
• Stakeholder management
• Coordinate with internal and external stakeholders for collation and accuracy of data
• Provide timely assistance in case of an escalation and support resolution of escalations/ issues
Additional Skills:
● Should have good knowledge of MS Excel and hands on experience in making reports
● Analytical Skills
● Data Visualization Skill
● Advanced knowledge of MS-Office/Office 365
Role descriptions / Expectations from the Role
Develop, deploy, and maintain Power Platform solutions, including canvas apps, model-driven apps, and flows.
Collaborate with cross-functional teams to define, design, and ship new features.
Troubleshoot and resolve issues related to Power Platform applications.
Ensure the security, scalability, and reliability of Power Platform solutions.
You will be joining a fast-growing product team within the End User Technology Services organisation. You will be part of the build-out of the capability in the region, and you will liaise with our customers across all the LEADING BANK business lines.



About the Role:
We are looking for a Senior Technical Customer Success Manager to join our growing team. This is a client-facing role focused on ensuring successful adoption and value realization of our SaaS solutions. The ideal candidate will come from a strong analytics background, possess hands-on skills in SQL and Python or R, and have experience working with dashboarding tools. Prior experience in eCommerce or retail domains is a strong plus.
Responsibilities:
- Own post-sale customer relationship and act as the primary technical point of contact.
- Drive product adoption and usage through effective onboarding, training, and ongoing support.
- Work closely with clients to understand business goals and align them with product capabilities.
- Collaborate with internal product, engineering, and data teams to deliver solutions and enhancements tailored to client needs.
- Analyze customer data and usage trends to proactively identify opportunities and risks.
- Build dashboards or reports for customers using internal tools or integrations.
- Lead business reviews, share insights, and communicate value delivered.
- Support customers in configuring rules, data integrations, and troubleshooting issues.
- Drive renewal and expansion by ensuring customer satisfaction and delivering measurable outcomes.
Requirements:
- 7+ years of experience in a Customer Success, Technical Account Management, or Solution Consulting role in a SaaS or software product company.
- Strong SQL skills and working experience with Python or R.
- Experience with dashboarding tools such as Tableau, Power BI, Looker, or similar.
- Understanding of data pipelines, APIs, and data modeling.
- Excellent communication and stakeholder management skills.
- Proven track record of managing mid to large enterprise clients.
- Experience in eCommerce, retail, or consumer-facing businesses is highly desirable.
- Ability to translate technical details into business context and vice versa.
- Bachelor’s or Master’s degree in Computer Science, Analytics, Engineering, or related field.
Nice to Have:
- Exposure to machine learning workflows, recommendation systems, or pricing analytics.
- Familiarity with cloud platforms (AWS/GCP/Azure).
- Experience working with cross-functional teams in Agile environments.

Senior Data Engineer Job Description
Overview
The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.
Key Responsibilities
Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
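The ingest, transform, and validate pattern described in the responsibilities above can be sketched with the standard library alone; the field names and the validation rule below are illustrative assumptions, not a specific production pipeline:

```python
# A minimal sketch of the ingest -> transform -> validate pipeline pattern
# described above. Field names and the quality rule are illustrative
# assumptions; a production pipeline would use an orchestrator like Airflow.
import json

def ingest(raw_lines):
    """Parse newline-delimited JSON records, skipping malformed lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in a real pipeline, route to a dead-letter store instead
    return records

def transform(records):
    """Normalize field names and coerce types."""
    return [{"user_id": int(r["id"]), "amount": float(r["amt"])} for r in records]

def validate(records):
    """Drop records that fail basic quality checks."""
    return [r for r in records if r["amount"] >= 0]

raw = ['{"id": "1", "amt": "9.5"}', 'not json', '{"id": "2", "amt": "-3"}']
clean = validate(transform(ingest(raw)))
print(clean)   # only the well-formed, non-negative record survives
```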
Required Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
Technical Skills:
Proficiency in programming languages such as Python, Java, or Scala.
Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.
Preferred Qualifications
Experience with real-time data processing and streaming architectures.
Familiarity with machine learning pipelines and MLOps practices.
Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.
Work Environment
Location: Hybrid/Remote/On-site (depending on company policy).
Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
Hours: Full-time, with occasional on-call responsibilities for critical data systems.
Job Summary:
Position : Senior Power BI Developer
Experience : 4+ Years
Location : Ahmedabad - WFO
Key Responsibilities:
- Design, develop, and maintain interactive and user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay updated with Power BI updates, best practices, and industry trends.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M Language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience in working with large and complex datasets.
- Experience in BigQuery, MySQL, or Looker Studio is a plus.
- E-commerce industry experience is an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Platform tools such as Power Apps & Power Automate would be a plus.
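The star-schema data modeling named in the requirements above can be sketched as a fact table keyed to a dimension table; the table contents below are illustrative assumptions:

```python
# A minimal sketch of star-schema modeling: a fact table of sales rows
# keyed to a product dimension. The tables are illustrative; in Power BI
# this rollup would be a DAX measure over related tables.
dim_product = {
    1: {"name": "Laptop", "category": "Electronics"},
    2: {"name": "Desk",   "category": "Furniture"},
}

fact_sales = [
    {"product_id": 1, "units": 3, "revenue": 2400.0},
    {"product_id": 2, "units": 1, "revenue": 350.0},
    {"product_id": 1, "units": 2, "revenue": 1600.0},
]

# Roll revenue up to a dimension attribute, as a report measure would.
by_category = {}
for row in fact_sales:
    category = dim_product[row["product_id"]]["category"]
    by_category[category] = by_category.get(category, 0.0) + row["revenue"]

print(by_category)   # {'Electronics': 4000.0, 'Furniture': 350.0}
```

Keeping descriptive attributes in the dimension and additive measures in the fact table is what lets a single fact table serve many slice-and-dice reports, which is the point of the star/snowflake schemas listed above.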
Preferred Qualifications:
- Microsoft Power BI Certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.

Description
Job Description
Founded in 2015, GreenTree Group is a cutting-edge real estate investment and asset management firm based in China. At GreenTree, we focus on acquiring high-quality, low-cost assets at significant discounts and transforming them through repositioning, leasing, and reselling. Leveraging our expertise and deep local networks, we consistently identify and unlock hidden value in overlooked opportunities using innovative, technology-driven methods.
Job Highlights
As a Data Science Analyst Intern for Microsoft Power Automate Projects, you will bridge the gap between data analytics and automation. You will play a pivotal role in designing, developing, and refining automated data workflows that enhance our business processes. By collaborating with cross-functional teams, you will help drive operational efficiency and data-driven decision-making across the organization—all while gaining real-world experience in data science and process automation.
Responsibilities
Design & Development: Work on building and optimizing automation workflows using Microsoft Power Automate to streamline critical business processes.
Data Integration: Collect and integrate data from various sources such as databases, APIs, spreadsheets, and web services to support analytic initiatives.
Analysis & Reporting: Utilize tools like Excel, SQL, and Python to analyze data, identify trends, and generate actionable insights that inform strategic decisions.
Collaboration: Partner with data, operations, and engineering teams to identify automation opportunities that improve efficiency and reduce manual effort.
Documentation: Develop process documentation, including detailed workflow diagrams, to ensure clarity and repeatability of automation processes.
Continuous Improvement: Assist in testing, deployment, and iterative refinement of automated solutions to maintain optimal performance and reliability.
What We Offer
Comprehensive Training: Access to extensive training materials and support to enhance your technical and professional skills.
Mentorship: Regular feedback and guidance sessions with supervisors and directors to foster your growth in data analytics and process automation.
Career Development: Opportunities for increased responsibility and project leadership based on performance and results.
Real-World Impact: Direct involvement in high-impact projects that drive operational improvements and technological innovation.
Requirements
Educational Background: Pursuing or holding a degree in Data Science, Computer Science, Information Systems, or a related field.
Technical Skills: Proficiency in data analysis using Excel; familiarity with SQL, Python, or other programming languages is preferred.
Automation Exposure: Exposure to Microsoft Power Automate, Power Apps, or similar RPA tools is an asset.
Analytical Mindset: Strong logical reasoning and problem-solving capabilities, with an eagerness to learn and explore new solutions.
Communication: Excellent verbal and written communication skills, with the ability to clearly present complex information.
Team Player: A proactive and collaborative approach to working within cross-functional teams.
About GreenTree
GreenTree Group is dedicated to delivering innovative solutions in real estate investment and asset management. Our mission is to build lasting relationships with our investors by utilizing advanced technology to optimize processes and deliver exceptional value. We pride ourselves on our commitment to excellence, creativity, and integrity in every project we undertake.
Location: Shanghai, China
Start your career with GreenTree and gain invaluable experience at the intersection of data science, automation, and real estate operations!
Job Type: Full-time
Pay: $50 per month
Work Location: Remote

Job Purpose
Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.
Key Responsibilities:
- Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
- Perform data transformation and validation for accuracy and consistency.
- Upload processed datasets into SQL Server using SSIS packages.
- Monitor and optimize database performance, identifying and resolving bottlenecks.
- Perform regular backups, restorations, and recovery checks to ensure data continuity.
- Manage user access and implement robust database security policies.
- Oversee database storage allocation and utilization.
- Conduct routine maintenance and support incident management, including root cause analysis and resolution.
- Design and implement scalable database solutions and architecture.
- Create and maintain stored procedures, views, and other database components.
- Optimize SQL queries for performance and scalability.
- Execute ETL processes and support seamless integration of multiple data sources.
- Maintain data integrity and quality through validation and cleansing routines.
- Collaborate with cross-functional teams on data solutions and project deliverables.
Educational Qualification: Any Graduate
Required Skills & Qualifications:
- Proven experience with SQL Server or similar relational database platforms.
- Strong expertise in SSIS, ETL processes, and data warehousing.
- Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
- Experience in database security, user role management, and access control.
- Familiarity with backup/recovery strategies and database maintenance best practices.
- Strong analytical skills with experience working with large and complex datasets.
- Solid understanding of data modeling, normalization, and schema design.
- Knowledge of incident and change management processes.
- Excellent communication and collaboration skills.
- Experience with Python for data manipulation and automation is a strong plus.

Location: Mumbai
Job Type: Full-Time (Hybrid – 3 days in office, 2 days WFH)
Job Overview:
We are looking for a skilled Azure Data Engineer with strong experience in data modeling, pipeline development, and SQL/Spark expertise. The ideal candidate will work closely with the Data Analytics & BI teams to implement robust data solutions on Azure Synapse and ensure seamless data integration with third-party applications.
Key Responsibilities:
- Design, develop, and maintain Azure data pipelines using Azure Synapse (SQL dedicated pools or Apache Spark pools).
- Implement data models in collaboration with the Data Analytics and BI teams.
- Optimize and manage large-scale SQL and Spark-based data processing solutions.
- Ensure data availability and reliability for third-party application consumption.
- Collaborate with cross-functional teams to translate business requirements into scalable data solutions.
Required Skills & Experience:
3–5 years of hands-on experience in:
- Azure data services
- Data Modeling
- SQL development and tuning
- Apache Spark
- Strong knowledge of Azure Synapse Analytics.
- Experience in designing data pipelines and ETL/ELT processes.
- Ability to troubleshoot and optimize complex data workflows.
Preferred Qualifications:
- Experience with data governance, security, and data quality practices.
- Familiarity with DevOps practices in a data engineering context.
- Effective communication skills and the ability to work in a collaborative team environment.

About the Role:
We are looking for a skilled and detail-oriented Data Analyst to join our team. The ideal candidate will be responsible for collecting, analyzing, and interpreting large datasets to support data-driven decision-making across the organization. Proficiency in MongoDB and SQL is essential for this role.
Key Responsibilities:
- Collect, process, and clean structured and unstructured data from various sources.
- Analyze data using SQL queries and MongoDB aggregations to extract insights.
- Develop and maintain dashboards, reports, and visualizations to present data in a meaningful way.
- Collaborate with cross-functional teams to identify business needs and provide data-driven solutions.
- Monitor data quality and integrity, ensuring accuracy and consistency.
- Support the development of predictive models and data pipelines.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
- Proven experience as a Data Analyst or similar role.
- Strong proficiency in SQL for data querying and manipulation.
- Hands-on experience with MongoDB, including working with collections, documents, and aggregations.
- Knowledge of data visualization tools such as Tableau, Power BI, or similar (optional but preferred).
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
Good to Have:
- Experience with Python/R for data analysis.
- Exposure to ETL tools and data warehousing concepts.
- Understanding of statistical methods and A/B testing.
Job Title: Tableau BI Developer
Years of Experience: 4–8 years
Rate: $12 per hour (FTE engagement)
Working hours: 8 hours per day
Required Skills & Experience:
✅ 4–8 years of experience in BI development and data engineering
✅ Expertise in BigQuery and/or Snowflake for large-scale data processing
✅ Strong SQL skills with experience writing complex analytical queries
✅ Experience in creating dashboards in tools like Power BI, Looker, or similar
✅ Hands-on experience with ETL/ELT tools and data pipeline orchestration
✅ Familiarity with cloud platforms (GCP, AWS, or Azure)
✅ Strong understanding of data modeling, data warehousing, and analytics best practices
✅ Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
Job Title : SAP BW/HANA Consultant with ABAP & Power BI
Total Experience : 8+ Years
Relevant Experience : 6+ Years
Location : Taramani, Chennai
Interview Mode : Virtual
Notice Period: Immediate Joiners Only
Job Summary :
We are looking for a seasoned professional with a strong background in Data Warehousing and hands-on experience with SAP BW/HANA. The ideal candidate should possess expertise in ABAP programming, have working knowledge or exposure to Power BI, and show interest in learning cloud technologies.
Mandatory Skills : Data Warehousing, SAP BW, SAP HANA, ABAP Programming, Power BI (basic knowledge), Cloud Technologies (willingness to learn).
Key Responsibilities :
- Design, develop, and support data warehouse solutions using SAP BW and HANA.
- Write efficient ABAP programs to support custom functionality within SAP environments.
- Integrate data and visualizations using Power BI (working knowledge acceptable).
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Stay updated with cloud technology trends and show willingness to adapt and learn new tools and platforms.
Mandatory Skills :
- Strong knowledge of Data Warehousing.
- Hands-on experience with SAP BW and SAP HANA.
- Proficiency in ABAP programming.
- Exposure or experience with Power BI.
- Interest or basic exposure to cloud technologies.

We are seeking a passionate and experienced Data Analyst Trainer to design, develop, and deliver training content for aspiring or existing data professionals. The trainer will be responsible for teaching core data analytics skills, tools, and industry practices to ensure trainees are job-ready or upskilled.

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.
Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd
Job Description:
- Role: Junior Business/Data Analyst (Internship + PPO)
- Work Location: Hyderabad
- Internship Stipend: ₹15,000 – ₹25,000/month
- Internship Duration: 3 months
- CTC on PPO: 5 LPA - 6 LPA
Eligibility Criteria:
- Degree: Open to all academic backgrounds
- Graduation Year: 2023, 2024, 2025
Required Skills:
- Proficiency in SQL, Excel, Power BI, and basic Python
- Strong analytical mindset and interest in solving business problems with data
Hiring Process:
- Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
- 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)
Note: Please bring your laptop and earphones for the test.
Register Here: https://go.acciojob.com/69d3Wd
Job Role - Power BI Lead
9 to 12 Years Experience Required.
Location - Pune (Baner / Viman Nagar)
Work Model - Hybrid (Wednesday and Thursday WFO) 12 PM to 9 PM
Experience with Banking or GRC Domain is preferred.
- JOB SUMMARY
- Role Overview: We are seeking a highly skilled Power BI Expert to lead the design, development, implementation, and governance of Power BI solutions. The ideal candidate will have in-depth knowledge of Power BI architecture, data modeling, governance, embedded analytics, and database management. The role requires expertise in Power BI Data Gateways, report deployment, and governance frameworks, ensuring scalable and secure data solutions.
- PRIMARY RESPONSIBILITIES
- Power BI Lead & Implementation:
- Design, develop, and deploy interactive Power BI reports and dashboards.
- Create efficient data models to optimize performance and scalability.
- Develop complex DAX expressions for business logic and calculations.
- Optimize report performance by using best practices in Power BI and SQL
- Power BI Architecture & Configuration:
- Configure and manage Power BI Data Gateways for secure and seamless data access
- Define and enforce Power BI workspace, dataset, and security policies.
- Implement row-level security (RLS) and data governance best practices.
- Establish data refresh schedules and ensure efficient data ingestion pipelines.
- Maintain and enhance Power BI Premium and Embedded solutions.
- Embedded Analytics & Integration:
- Integrate Power BI reports with external applications using Power BI Embedded.
- Work with Power BI REST APIs to automate workflows.
- Integrate Power BI with Oracle, SQL Server, MySQL, Microsoft SharePoint, Excel, cloud data sources, etc.
- Database & Performance Optimization:
- Write optimized SQL queries and stored procedures for report development.
- Ensure high-performance data refreshes and query execution.
- Work with the ETL team to improve data integration with Power BI.
- Governance & Security:
- Define Power BI governance framework and best practices for standardization.
- Monitor user access, performance, and usage analytics to drive efficiency.
- Manage user roles, access controls, and data security.
- PowerApps & Power Automate (Nice to Have):
- Build PowerApps applications to extend Power BI functionality and create interactive business solutions.
- Automate data flows and reporting updates using Power Automate (Flows, Triggers, Approvals, Notifications, etc.).
- Integrate Power BI, PowerApps, and Power Automate to create end-to-end business process automation.
- Stakeholder Collaboration & Training:
- Work closely with business users, data engineers, and leadership teams to understand and document reporting requirements.
- Provide training and best practice guidance to Power BI users across the organization.
- Develop self-service Power BI frameworks to empower business teams for reporting.
- Troubleshoot Power BI performance and user issues.

This is a full-time, onsite trainer-cum-developer role: preparing placement students with the technical knowledge, skills, and confidence required to succeed in campus recruitment drives, technical interviews, and entry-level job roles in the industry.
What You Will Do:
Your responsibilities will include, but are not limited to, the following:
· Develop and maintain data pipelines using Azure Data Factory (ADF) /Databricks for data integration and ETL processes.
· Design, implement, and optimize Power BI /Fabric reports and dashboards to deliver actionable business insights.
· Collaborate with data analysts, business users, and other teams to understand data requirements and deliver solutions using ADF and Power BI.
· Extract, transform, and load (ETL) data from various sources into cloud-based storage systems such as Azure Data Lake or Azure SQL Database.
· Work with large datasets and optimize queries and pipelines for performance and scalability.
· Ensure data quality, integrity, and availability throughout the data lifecycle.
· Automate repetitive data tasks, ensuring timely and accurate reporting.
· Monitor and troubleshoot data pipelines, addressing any performance or data issues promptly.
· Support data visualization and reporting tools, including Power BI, to enable business stakeholders to make data-driven decisions.
· Write clear, efficient, and maintainable code for data transformations and automation.
Required Qualifications:
· Bachelor's degree in computer science, Information Technology, Engineering, or a related field.
· 8+ years of hands-on experience as a Data Engineer, BI Developer, or in a similar role.
· Proficiency with Azure Data Factory (ADF), including the creation of data pipelines and managing data flows.
· Strong experience in Power BI, including report creation, dashboard development, and data modeling.
· Experience with SQL and database management (e.g., Azure SQL Database, SQL Server).
· Knowledge of cloud platforms, especially Microsoft Azure.
· Familiarity with data warehousing concepts and ETL processes.
· Experience working with cloud-based data storage solutions (e.g., Azure Data Lake, Azure Blob Storage).
· Strong programming skills in languages such as Python, SQL, or other relevant languages.
· Ability to troubleshoot and optimize data pipelines for performance and reliability.
Preferred Qualifications:
· Familiarity with data modeling techniques and practices for Power BI.
· Knowledge of Azure Databricks or other data processing frameworks.
· Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need:
· B.Tech in Computer Science or equivalent.
Job Title: Business Analyst (Fresher)
Job Type: Full-Time | Remote | 5 Days Working
Salary: ₹7,000 – ₹8,000 per month
Experience Required: 6 months to 1 year (Freshers with relevant skills only)
Joining: Immediate Joiners Only
About the Role:
We are looking for freshers with strong foundational skills and knowledge in business analysis and manual testing. In this position, you will be responsible for manually handling tasks related to both functions.
Key Responsibilities:
- Gather and analyze business requirements from stakeholders
- Create documentation such as BRDs, FRDs, user stories, and process flows
- Perform manual testing of software applications
- Prepare test cases, test plans, and report bugs clearly
- Collaborate with development and business teams to ensure product quality and requirement clarity
- Provide timely updates and reports on progress and findings
Requirements:
- Must have skills and knowledge in Business Analysis
- Must be able to manage both roles manually and independently
- Proficiency in tools related to BA
- Excellent communication skills in English (spoken and written)
- Must have a personal laptop and a stable internet connection
- Must be available to join immediately
Who Should Apply:
- Freshers with 6 months to 1 year of experience in relevant roles
- Candidates who are confident in handling both BA and testing responsibilities
- Individuals looking to build a strong foundation in both domains in a remote, full-time role
Job Description:
- Design and deploy OTBI dashboards and subject areas.
- Build BI Publisher reports (PO Print, Invoice Print, custom KPIs).
- Configure bursting logic and scheduled reports.
- Work with finance and SCM teams to deliver audit-compliant reports.
- Experience Required: At least 1–2 Oracle Cloud BI/reporting implementations with enhancement support.
Tools:
BI Publisher, OTBI, SQL, Excel Basics & Advance, Power BI, other BI tools.
Certifications (Mandatory): Oracle BI or OTBI Certification.
Must have Skills: OTBI Subject Areas, BI Publisher Layouts, SQL, Scheduling, Bursting Logic
Good to have skills - ERP Reporting (Finance + SCM), Any other tool – Tableau or Inhouse Developed tools
Soft Skills: Strong Communication, Collaboration, Product Roadmap, Growth Mindset, Ability to Navigate challenges during conversation with Stakeholders.

Senior Data Analyst – Power BI, GCP, Python & SQL
Job Summary
We are looking for a Senior Data Analyst with strong expertise in Power BI, Google Cloud Platform (GCP), Python, and SQL to design data models, automate analytics workflows, and deliver business intelligence that drives strategic decisions. The ideal candidate is a problem-solver who can work with complex datasets in the cloud, build intuitive dashboards, and code custom analytics using Python and SQL.
Key Responsibilities
* Develop advanced Power BI dashboards and reports based on structured and semi-structured data from BigQuery and other GCP sources.
* Write and optimize complex SQL queries (BigQuery SQL) for reporting and data modeling.
* Use Python to automate data preparation tasks, build reusable analytics scripts, and support ad hoc data requests.
* Partner with data engineers and stakeholders to define metrics, build ETL pipelines, and create scalable data models.
* Design and implement star/snowflake schema models and DAX measures in Power BI.
* Maintain data integrity, monitor performance, and ensure security best practices across all reporting systems.
* Drive initiatives around data quality, governance, and cost optimization on GCP.
* Mentor junior analysts and actively contribute to analytics strategy and roadmap.
Must-Have Skills
* Expert-level SQL: Hands-on experience writing complex queries in BigQuery, optimizing joins, window functions, CTEs.
* Proficiency in Python: Data wrangling, Pandas, NumPy, automation scripts, API consumption, etc.
* Power BI expertise: Building dashboards, using DAX, Power Query (M), custom visuals, report performance tuning.
* GCP hands-on experience: Especially with BigQuery, Cloud Storage, and optionally Cloud Composer or Dataflow.
* Strong understanding of data modeling, ETL pipelines, and analytics workflows.
* Excellent communication skills and the ability to explain data insights to non-technical audiences.
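For a sense of the CTE-and-window-function work this role calls for, here is a minimal sketch using Python's stdlib `sqlite3` in place of BigQuery; the `donations` table and its columns are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE donations (donor TEXT, month TEXT, amount REAL);
INSERT INTO donations VALUES
  ('a', '2024-01', 100), ('a', '2024-02', 150),
  ('b', '2024-01', 200), ('b', '2024-02', 50);
""")

# A CTE plus a window function: running total of donations per donor.
rows = conn.execute("""
WITH monthly AS (
  SELECT donor, month, SUM(amount) AS total
  FROM donations GROUP BY donor, month
)
SELECT donor, month, total,
       SUM(total) OVER (PARTITION BY donor ORDER BY month) AS running_total
FROM monthly ORDER BY donor, month
""").fetchall()

for r in rows:
    print(r)
```

The same shape (a CTE feeding a windowed aggregate) carries over to BigQuery SQL more or less verbatim.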
Preferred Qualifications
* Experience in version control (Git) and working in CI/CD environments.
* Google Professional Data Engineer
* PL-300: Microsoft Power BI Data Analyst Associate

Job Summary
We are looking for a highly driven and ambitious person who can lead multi-layered teams and who places great importance on learning and improving. With a relentless focus on execution, you need to be highly data-driven and have a strong sense of ownership. Prior experience in operations, procurement, program management, or process standardization is mandatory.
Job Responsibilities:
- Take complete ownership of Project Execution (including P & L)
- Take up existing projects and optimize them by closely working with the product team.
- Handle general operations which include daily execution, driving volume, tracking progress, highlighting flags and daily reporting.
- Analyze training needs and provide training.
- Collect feedback on a regular basis and resolve any issues.
- Monitor team performance to achieve the KRAs.
- Mentor and motivate the student workforce. Demonstrate strong people engagement skills.
- Run the pilot of projects.
- Suggest improvements in processes at every level in operations.
- Set goals/KPIs/targets for the team members.
Desired Skills
- 0-1 years of relevant experience in managing an operations team.
- Proven ability in driving tightly controlled operational metrics
- Strong process orientation & business acumen
- You should have good people management, team building and program management skills.
- Strongly inclined to do high-quality and impactful work in a dynamic and unstructured environment.
- Higher than normal sense of ownership with a clear bias for action.
- Relevant educational qualification.
- Must have impeccable verbal and written communication skills (Both English and Hindi).


Senior Data Engineer
Location: Bangalore, Gurugram (Hybrid)
Experience: 4-8 Years
Type: Full Time | Permanent
Job Summary:
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.
This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.
Key Responsibilities:
PostgreSQL & Data Modeling
· Design and optimize complex SQL queries, stored procedures, and indexes
· Perform performance tuning and query plan analysis
· Contribute to schema design and data normalization
Data Migration & Transformation
· Migrate data from multiple sources to cloud or ODS platforms
· Design schema mapping and implement transformation logic
· Ensure consistency, integrity, and accuracy in migrated data
Python Scripting for Data Engineering
· Build automation scripts for data ingestion, cleansing, and transformation
· Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
· Maintain reusable script modules for operational pipelines
Data Orchestration with Apache Airflow
· Develop and manage DAGs for batch/stream workflows
· Implement retries, task dependencies, notifications, and failure handling
· Integrate Airflow with cloud services, data lakes, and data warehouses
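Airflow itself is not shown here; as a rough illustration of the concepts named above (task dependencies and retries), this stdlib-only Python sketch runs a three-task DAG in dependency order. The function names and retry policy are invented, not the Airflow API:

```python
import time

def run_with_retries(task, retries=2, delay=0.0):
    """Run a task callable, retrying on failure, as an Airflow task would."""
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(delay)

def run_dag(tasks, deps):
    """Run tasks in dependency order (a tiny topological pass).
    `tasks` maps name -> callable; `deps` maps name -> set of upstream names."""
    done, order = set(), []
    while len(done) < len(tasks):
        for name in tasks:
            if name not in done and deps.get(name, set()) <= done:
                run_with_retries(tasks[name])
                done.add(name)
                order.append(name)
    return order

# Extract -> transform -> load, expressed as a three-node DAG.
log = []
order = run_dag(
    {"extract": lambda: log.append("E"),
     "transform": lambda: log.append("T"),
     "load": lambda: log.append("L")},
    {"transform": {"extract"}, "load": {"transform"}},
)
print(order)
```

In real Airflow, the dependencies would be declared with `>>` between operators and retries set per task; the scheduling logic above is what the framework handles for you.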
Cloud Platforms (AWS / Azure / GCP)
· Manage data storage (S3, GCS, Blob), compute services, and data pipelines
· Set up permissions, IAM roles, encryption, and logging for security
· Monitor and optimize cost and performance of cloud-based data operations
Data Marts & Analytics Layer
· Design and manage data marts using dimensional models
· Build star/snowflake schemas to support BI and self-serve analytics
· Enable incremental load strategies and partitioning
Modern Data Stack Integration
· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
· Support modular pipeline design and metadata-driven frameworks
· Ensure high availability and scalability of the stack
BI & Reporting Tools (Power BI / Superset / Supertech)
· Collaborate with BI teams to design datasets and optimize queries
· Support development of dashboards and reporting layers
· Manage access, data refreshes, and performance for BI tools
Required Skills & Qualifications:
· 4–6 years of hands-on experience in data engineering roles
· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
· Advanced Python scripting skills for automation and ETL
· Proven experience with Apache Airflow (custom DAGs, error handling)
· Solid understanding of cloud architecture (especially AWS)
· Experience with data marts and dimensional data modeling
· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
· Version control (Git) and CI/CD pipeline knowledge is a plus
· Excellent problem-solving and communication skills
About the Role
We are seeking an innovative Data Scientist specializing in Natural Language Processing (NLP) to join our technology team in Bangalore. The ideal candidate will harness the power of language models and document extraction techniques to transform legal information into accessible, actionable insights for our clients.
Responsibilities
- Develop and implement NLP solutions to automate legal document analysis and extraction
- Create and optimize prompt engineering strategies for large language models
- Design search functionality leveraging semantic understanding of legal documents
- Build document extraction pipelines to process unstructured legal text data
- Develop data visualizations using PowerBI and Tableau to communicate insights
- Collaborate with product and legal teams to enhance our tech-enabled services
- Continuously improve model performance and user experience.
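Real pipelines for this role would sit on LLM APIs, but the basic shape of a document-extraction step can be sketched with nothing more than the stdlib `re` module; the sample text, field names, and patterns below are all invented:

```python
import re

# Toy stand-in for an extraction pipeline: pull structured fields out of
# unstructured legal-style text. (An LLM-driven pipeline would replace the
# regexes with model calls; this only illustrates the input/output shape.)
doc = """This Agreement is made on 01/03/2024 between Acme Pvt Ltd ("Vendor")
and Globex LLP ("Client") for consulting services."""

def extract_fields(text):
    date = re.search(r"\b(\d{2}/\d{2}/\d{4})\b", text)
    # Capture 'Some Name ("Role")' pairs.
    parties = re.findall(r'([A-Z][\w ]+?)\s+\("(\w+)"\)', text)
    return {
        "date": date.group(1) if date else None,
        "parties": {role: name.strip() for name, role in parties},
    }

fields = extract_fields(doc)
print(fields)
```

The output is a plain dict, which is the kind of structured record a downstream search index or PowerBI/Tableau dashboard would consume.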
Requirements
- Bachelor's degree in relevant field
- 1-5 years of professional experience in data science, with focus on NLP applications
- Demonstrated experience working with LLM APIs (e.g., OpenAI, Anthropic)
- Proficiency in prompt engineering and optimization techniques
- Experience with document extraction and information retrieval systems
- Strong skills in data visualization tools, particularly PowerBI and Tableau
- Excellent programming skills in Python and familiarity with NLP libraries
- Strong understanding of legal terminology and document structures (preferred)
- Excellent communication skills in English
What We Offer
- Competitive salary and benefits package
- Opportunity to work at India's largest legal tech company
- Professional growth in the fast-evolving legal technology sector
- Collaborative work environment with industry experts
- Modern office located in Bangalore
- Flexible work arrangements
Qualified candidates are encouraged to apply with a resume highlighting relevant experience with NLP, prompt engineering, and data visualization tools.
Location: Bangalore, India
We’re looking for an experienced SQL Developer with 3+ years of hands-on experience to join our growing team. In this role, you’ll be responsible for designing, developing, and maintaining SQL queries, procedures, and data systems that support our business operations and decision-making processes. You should be passionate about data, highly analytical, and capable of working both independently and collaboratively with cross-functional teams.
Key Responsibilities:
Design, develop, and maintain complex SQL queries, stored procedures, functions, and views.
Optimize existing queries for performance and efficiency.
Collaborate with data analysts, developers, and stakeholders to understand requirements and translate them into robust SQL solutions.
Design and implement ETL processes to move and transform data between systems.
Perform data validation, troubleshooting, and quality checks.
Maintain and improve existing databases, ensuring data integrity, security, and accessibility.
Document code, processes, and data models to support scalability and maintainability.
Monitor database performance and provide recommendations for improvement.
Work with BI tools and support dashboard/report development as needed.
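As a small illustration of views and query-plan-driven optimization of the kind listed above, the following uses Python's stdlib `sqlite3` (the `orders` table is invented; production work would target MS SQL Server, PostgreSQL, MySQL, or Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
INSERT INTO orders (customer, total) VALUES
  ('acme', 120.0), ('acme', 80.0), ('globex', 40.0);
""")

# A view encapsulates reporting logic for reuse, much as a stored
# procedure or view would in a production database.
conn.execute("""
CREATE VIEW customer_totals AS
SELECT customer, SUM(total) AS revenue, COUNT(*) AS order_count
FROM orders GROUP BY customer
""")

# An index supports the optimization step; EXPLAIN QUERY PLAN
# confirms the engine actually uses it for the filtered lookup.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'acme'"
).fetchall()

rows = conn.execute("SELECT * FROM customer_totals ORDER BY customer").fetchall()
print(rows)
print(plan)
```

Checking the plan after adding an index is the same habit the posting's "performance tuning" bullet refers to, just at toy scale.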
Requirements:
3+ years of proven experience as an SQL Developer or in a similar role.
Strong knowledge of SQL and relational database systems (e.g., MS SQL Server, PostgreSQL, MySQL, Oracle).
Experience with performance tuning and optimization.
Proficiency in writing complex queries and working with large datasets.
Experience with ETL tools and data pipeline creation.
Familiarity with data warehousing concepts and BI reporting.
Solid understanding of database security, backup, and recovery.
Excellent problem-solving skills and attention to detail.
Good communication skills and ability to work in a team environment.
Nice to Have:
Experience with cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
Knowledge of Python, Power BI, or other scripting/analytics tools.
Experience working in Agile or Scrum environments.

About the company:
Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us to create a large-scale impact on a daily basis by taking our product to the next level
Role Overview:
Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.
Key Responsibilities
● Data Strategy & Automation:
○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.
○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.
● Data Analysis & Insight Generation:
○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.
○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.
● Reporting & Quality Assurance:
○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.
○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.
● Collaboration & Strategic Planning:
○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.
○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.
○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.
Required Skills and Qualifications
● Technical Expertise:
○ Strong background in SQL, statistics, and mathematics
● Analytical & Strategic Mindset:
○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.
○ Experience with statistical analysis and advanced analytics
● Communication & Collaboration:
○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.
○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.
● Preferred Experience:
○ Proven experience in advanced analytics roles
○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.
Why Join Ketto?
At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!
Job Summary:
As a Data Engineering Lead, your role will involve designing, developing, and implementing interactive dashboards and reports using data engineering tools. You will work closely with stakeholders to gather requirements and translate them into effective data visualizations that provide valuable insights. Additionally, you will be responsible for extracting, transforming, and loading data from multiple sources into Power BI, ensuring its accuracy and integrity. Your expertise in Power BI and data analytics will contribute to informed decision-making and support the organization in driving data-centric strategies and initiatives.
1) Required Experience : 6+ years
2) Lead Experience : 2+ years
3) Mandatory Skills : Power BI, SQL, Azure Data Factory
4) Budget Range : 28-32 LPA
5) Locations : Hyderabad, Indore, and Ahmedabad
6) Immediate joiners preferable
7) A total of 4 rounds will be conducted; the candidate should attend one round face-to-face (F2F) at the Hyderabad, Indore, or Ahmedabad location
8) Candidate should be available to work from office all 5 days
We are looking for you!
---> As an ideal candidate for the Data Engineering Lead position, you embody the qualities of a team player with a relentless get-it-done attitude. Your intellectual curiosity and customer focus drive you to continuously seek new ways to add value to your job accomplishments. You thrive under pressure, maintaining a positive attitude and understanding that your career is a journey. You are willing to make the right choices to support your growth. In addition to your excellent communication skills, both written and verbal, you have a proven ability to create visually compelling designs using tools like Power BI and Tableau that effectively communicate our core values.
---> You build high-performing, scalable, enterprise-grade applications and teams. Your creativity and proactive nature enable you to think differently, find innovative solutions, deliver high-quality outputs, and ensure customers remain referenceable. With over eight years of experience in data engineering, you possess a strong sense of self-motivation and take ownership of your responsibilities. You prefer to work independently with little to no supervision.
---> You are process-oriented, adopt a methodical approach, and demonstrate a quality-first mindset. You have led mid to large-size teams and accounts, consistently using constructive feedback mechanisms to improve productivity, accountability, and performance within the team. Your track record showcases your results-driven approach, as you have consistently delivered successful projects with customer case studies published on public platforms. Overall, you possess a unique combination of skills, qualities, and experiences that make you an ideal fit to lead our data engineering team(s). You value inclusivity and want to join a culture that empowers you to show up as your authentic self.
---> You know that success hinges on commitment, our differences make us stronger, and the finish line is always sweeter when the whole team crosses together. In your role, you should be driving the team using data, data, and more data. You will manage multiple teams, oversee agile stories and their statuses, handle escalations and mitigations, plan ahead, identify hiring needs, collaborate with recruitment teams for hiring, enable sales with pre-sales teams, and work closely with development managers/leads for solutioning and delivery statuses, as well as architects for technology research and solutions.
What You Will Do:
- Analyze Business Requirements.
- Analyze the data model and perform gap analysis against business requirements and Power BI.
- Design and Model Power BI schema.
- Transformation of Data in Power BI/SQL/ETL Tool.
- Create DAX formulas, reports, and dashboards.
- Write SQL queries and stored procedures.
- Design effective Power BI solutions based on business requirements.
- Manage a team of Power BI developers and guide their work.
- Integrate data from various sources into Power BI for analysis.
- Optimize performance of reports and dashboards for smooth usage.
- Collaborate with stakeholders to align Power BI projects with goals.
- Knowledge of data warehousing (must); data engineering is a plus.
What we need:
• B.Tech in Computer Science or equivalent
• 6+ years of relevant experience
We are looking for an experienced Senior Functional Consultant with 7+ years of experience in Microsoft Dynamics 365 Marketing Automation.
Key Requirements:
- 7+ years of experience in MS Dynamics 365 Marketing Automation
- Expertise in Customer Segmentation, Real-Time Marketing & Personalization
- Strong experience in Email & Campaign Automation
- Hands-on knowledge of Power Automate for Marketing Workflows
- Proficiency in Data Analytics, Reporting (Customer Insights AI), and Customer Journey Orchestration
- Experience with Dynamics 365 Sales, Power BI, and Integration
- Self-management and accountability for SLAs & deliverables
Job Title: Data Visualization Engineer
Experience: 2 to 4 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a skilled Data Visualization Engineer with expertise in Qlik Sense and experience working with reporting tools such as Power BI, Tableau, and Looker. The ideal candidate will have a strong understanding of QVF and QVD structures, basic HTTP API integrations, and end-to-end data pipelines. Some knowledge of Python for data processing and automation is a plus. This role will primarily focus on Qlik Sense reporting.
Key Responsibilities:
1. Data Visualization & Reporting
- Design, develop, and maintain interactive dashboards and reports using Qlik Sense.
- Work with Power BI, Tableau, Looker, and Qlik Sense to create compelling data visualizations.
- Ensure seamless data representation and storytelling through dashboards.
2. Qlik Sense Development & Optimization
- Develop and manage QVF and QVD structures for optimized data retrieval.
- Implement best practices in Qlik Sense scripting, data modeling, and performance tuning.
- Maintain and optimize existing Qlik Sense applications.
3. Data Integration & API Interactions
- Utilize basic HTTP APIs to integrate external data sources into dashboards.
- Work with data teams to ensure smooth data ingestion and transformation for visualization.
4. End-to-End Data Pipeline Understanding
- Collaborate with data engineers to understand and optimize data flows from source to visualization.
- Ensure data consistency, integrity, and performance in reporting solutions.
5. Scripting & Automation
- Utilize Python for data manipulation, automation, and minor custom integrations.
- Improve reporting workflows through automation scripts and process optimizations.
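The API-integration and Python-automation duties above can be illustrated with a minimal sketch: flattening nested JSON records (as typically returned by an HTTP API) into flat CSV rows ready for loading into a Qlik Sense app or any other BI tool. The field names and nesting here are purely illustrative assumptions, not a real schema; a production script would add the actual HTTP call (e.g. via urllib or requests) and error handling.

```python
import csv
import io

def flatten_records(records):
    """Flatten nested JSON-style dicts into flat rows, joining
    nested keys with dots (e.g. {"user": {"name": ...}} -> "user.name")."""
    rows = []
    for rec in records:
        flat = {}
        stack = [("", rec)]
        while stack:
            path, obj = stack.pop()
            for key, value in obj.items():
                full = key if not path else f"{path}.{key}"
                if isinstance(value, dict):
                    stack.append((full, value))
                else:
                    flat[full] = value
        rows.append(flat)
    return rows

def rows_to_csv(rows):
    """Serialize flattened rows to CSV text with a sorted header,
    suitable as a file-based data source for a dashboard load script."""
    fieldnames = sorted({k for row in rows for k in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

In practice the CSV (or an equivalent QVD produced by the Qlik load script) would then be consumed by the dashboard layer; the point of the sketch is the reshaping step between the API response and the reporting tool.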
Technical Expertise Required:
- 2 to 4 years of experience in Data Visualization or BI Reporting roles.
- Strong experience with Qlik Sense (QVF & QVD structures, scripting, visualization).
- Hands-on experience with Power BI, Tableau, and Looker.
- Basic understanding of HTTP APIs for data integration.
- Understanding of end-to-end data pipelines.
- Knowledge of Python for automation and data transformation.
- Experience in performance optimization of dashboards and reports.
- Strong analytical and problem-solving skills.
Preferred Qualifications:
- Experience in data modeling and ETL concepts.
- Familiarity with cloud-based data visualization solutions.
- Understanding of data governance and security best practices.
We are seeking a highly skilled and experienced Power BI Lead / Architect to join our growing team. The ideal candidate will have a strong understanding of data warehousing, data modeling, and business intelligence best practices. This role will be responsible for leading the design, development, and implementation of complex Power BI solutions that provide actionable insights to key stakeholders across the organization.
Location - Pune (Hybrid 3 days)
Responsibilities:
Lead the design, development, and implementation of complex Power BI dashboards, reports, and visualizations.
Develop and maintain data models (star schema, snowflake schema) for optimal data analysis and reporting.
Perform data analysis, data cleansing, and data transformation using SQL and other ETL tools.
Collaborate with business stakeholders to understand their data needs and translate them into effective and insightful reports.
Develop and maintain data pipelines and ETL processes to ensure data accuracy and consistency.
Troubleshoot and resolve technical issues related to Power BI dashboards and reports.
Provide technical guidance and mentorship to junior team members.
Stay abreast of the latest trends and technologies in the Power BI ecosystem.
Ensure data security, governance, and compliance with industry best practices.
Contribute to the development and improvement of the organization's data and analytics strategy.
May lead and mentor a team of junior Power BI developers.
Qualifications:
8-12 years of experience in Business Intelligence and Data Analytics.
Proven expertise in Power BI development, including DAX and advanced data modeling techniques.
Strong SQL skills, including writing complex queries, stored procedures, and views.
Experience with ETL/ELT processes and tools.
Experience with data warehousing concepts and methodologies.
Excellent analytical, problem-solving, and communication skills.
Strong teamwork and collaboration skills.
Ability to work independently and proactively.
Bachelor's degree in Computer Science, Information Systems, or a related field preferred.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like dbt, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
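The ingestion and transformation duties above can be sketched with a small, hedged example: deduplicating a micro-batch of streaming events (keeping only the newest record per key) and rendering an idempotent MERGE statement for loading it into a Snowflake target table. Table, column, and staging names are illustrative assumptions, not a real schema, and a real pipeline would execute the statement through the Snowflake connector rather than build raw SQL strings.

```python
def dedupe_latest(events, key="event_id", ts="event_time"):
    """Keep only the newest record per key, a typical pre-load transform
    before merging a micro-batch into a warehouse table."""
    latest = {}
    for event in events:
        k = event[key]
        if k not in latest or event[ts] > latest[k][ts]:
            latest[k] = event
    return sorted(latest.values(), key=lambda e: e[key])

def build_merge_sql(table, key, columns):
    """Render an upsert-style MERGE for the deduplicated batch.
    Names here are placeholders; 'staging' is an assumed staging table."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    all_cols = [key] + columns
    return (
        f"MERGE INTO {table} t USING staging s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({', '.join(all_cols)}) "
        f"VALUES ({', '.join('s.' + c for c in all_cols)})"
    )
```

Deduplicating before the MERGE keeps the load idempotent even when Kafka delivers the same event more than once, which is the usual at-least-once delivery caveat for streaming ingestion.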
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, dbt, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
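The automated data validation responsibility above can be illustrated with a minimal rule-engine sketch: each named rule is a predicate applied to every row, and failures are collected rather than raised so the pipeline can report every problem in one pass. The rule names and column names are illustrative assumptions, not a real validation framework.

```python
def run_checks(rows, checks):
    """Run each named rule against every row; collect failures instead of
    raising, so one pass reports all data-quality problems in a batch."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in checks.items():
            if not rule(row):
                failures.append({"row": i, "check": name})
    return failures

# Example rule set (column names are placeholders)
CHECKS = {
    "id_not_null": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: (
        isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0
    ),
}
```

In a production setting the failure list would feed a quarantine table or alerting hook; the sketch only shows the separation between generic check execution and declarative, per-dataset rules.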
Job Title: Developer
Location: [Company Location or Remote]
Job Type: [Full-time/Part-time/Contract]
Experience Level: [Entry-level/Junior/Mid-level/Senior]
Job Summary:
We are seeking a skilled Developer to design, develop, and maintain high-quality software solutions. The ideal candidate should have strong problem-solving abilities, proficiency in programming languages, and a passion for technology. You will work closely with cross-functional teams to develop scalable and efficient applications.
Key Responsibilities:
- Design, develop, test, and deploy software applications.
- Write clean, efficient, and well-documented code.
- Collaborate with designers, product managers, and other developers.
- Troubleshoot and debug applications to optimize performance.
- Stay updated with emerging technologies and industry trends.
- Participate in code reviews and provide constructive feedback.
- Integrate third-party APIs and databases as needed.
- Ensure software security, scalability, and maintainability.
Required Skills & Qualifications:
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.
- Proficiency in [mention relevant programming languages, e.g., Python, Java, JavaScript, C++].
- Experience with [mention frameworks, e.g., React, Angular, Django, Flask, Spring Boot].
- Knowledge of databases such as [MySQL, PostgreSQL, MongoDB].
- Familiarity with version control systems like Git and GitHub.
- Strong problem-solving and analytical skills.
- Excellent teamwork and communication skills.
- Ability to work in an agile development environment.
Preferred Qualifications (if applicable):
- Experience in cloud technologies like AWS, Azure, or Google Cloud.
- Knowledge of DevOps practices and CI/CD pipelines.
- Experience with containerization tools like Docker and Kubernetes.
- Understanding of AI/ML concepts (for AI-related roles).
Benefits:
- Competitive salary and performance-based bonuses.
- Flexible work hours and remote work options.
- Health insurance and wellness programs.
- Career development and learning opportunities.
- Friendly and collaborative work culture.
Job Title : Senior AWS Data Engineer
Experience : 5+ Years
Location : Gurugram
Employment Type : Full-Time
Job Summary :
Seeking a Senior AWS Data Engineer to design, build, and optimize scalable data pipelines and data architectures on AWS. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.
Job Opening: ERP Developer – Noida/Gurgaon
📍 Location: Noida/Gurgaon
💼 Experience: 5-8 Years
We are looking for an ERP Developer with expertise in D365 Finance and Operations to join our team. If you have hands-on experience in Microsoft Dynamics AX 2012 R3, Power BI, Power Apps, MS SQL Server, and SSRS reports, we want to hear from you!
🔹 Key Skills & Expertise:
✅ D365 Finance & Operations (Finance Consultant Role)
✅ Microsoft Dynamics AX 2012 R3 / D365 Technical Development
✅ Power BI & Power Apps Platform
✅ MS SQL Server & SSRS Reports
✅ 24x7 ERP Support & Implementation