- 15+ years of EDW/BI experience, including at least 2-3 end-to-end EDW implementations as a Solution or Technical Program Manager
- Must have at least ONE Azure Data Platform implementation as a Solution or Technical Project Manager (Azure, Databricks, ADF, PySpark)
- Must have technology experience in ETL tools such as Informatica or DataStage
- Excellent communication and presentation skills
- Should be well versed with project estimation, project planning, execution, tracking & monitoring
- Should be well versed with delivery metrics in Waterfall and/or Agile delivery models, scrum management
- Preferred to have technology experience in BI tools such as MicroStrategy, Tableau, or Power BI
Who is this for?
If solving business challenges drives you, this is the place to be. Fornax is a team of cross-functional individuals who solve critical business challenges using core concepts of analytics and critical thinking.
We are seeking a skilled Business Analyst who has worked with a D2C/e-commerce brand in India. The ideal candidate will possess a strong blend of functional and technical expertise, particularly in Google Analytics, Google Ads, Facebook Ads, and Amazon Ads, along with a good understanding of the entire D2C/e-commerce marketing value chain.
Key Responsibilities:
- Analyze and interpret e-commerce data to identify trends, opportunities, and areas for improvement.
- Design creative data solutions by understanding the D2C /DTC value chain.
- Analyze the business needs of stakeholders and customers.
- Collaborate with internal and external teams to determine the project scope and vision.
- Gather customer requirements via workshops, questionnaires, surveys, site visits, workflow storyboards, use cases, and scenario mappings.
- Study and create detailed comparisons of solution architectures for deployment.
- Monitor and control the execution of the project
- Conduct and monitor extensive customer research on brands.
- Translate Business Requirements into functional requirements
- Create process models and BPMN diagrams to provide end-to-end visibility of the solution to business and dev teams.
- Communicate progress, issues, and developments to client teams, verbally or through written documentation.
- Create extensive project scope documentation to keep project and client teams on the same page.
- Develop and improve existing templates for capturing customer requirements.
Key Skills
- Exceptional Analytical and Problem-Solving Abilities: Demonstrated excellence in analytical reasoning, including a keen ability to analyze complex situations and data, leading to creative solutions and successful problem resolution.
- Great all-round understanding of the e-commerce and D2C sector in India. Prior experience working with a D2C brand, or on the agency side working directly with D2C brands.
- Bachelor’s degree in Analytics, Statistics, Computer Science, Engineering, or a related field.
- Minimum of 3 years of experience in analytics, data science, or a related consulting role.
- Proficient in data analysis tools and programming languages such as SQL, Python, R, or SAS.
- Strong problem-solving skills and the ability to work with complex datasets.
- Excellent communication and interpersonal skills to effectively convey insights to non-technical stakeholders.
- Experience in managing projects and leading teams is highly advantageous.
Enterprise Data Architect - Dataeconomy (25+ Years Experience)
About Dataeconomy:
Dataeconomy is a rapidly growing company at the forefront of Information Technology. We are driven by data and committed to using it to make better decisions, improve our products, and deliver exceptional value to our customers.
Job Summary:
Dataeconomy seeks a seasoned and strategic Enterprise Data Architect to lead the company's data transformation journey. With 25+ years of experience in data architecture and leadership, you will be pivotal in shaping our data infrastructure, governance, and culture. You will leverage your extensive expertise to build a foundation for future growth and innovation, ensuring our data assets are aligned with business objectives and drive measurable value.
Responsibilities:
Strategic Vision and Leadership:
Lead the creation and execution of a long-term data strategy aligned with the company's overall vision and goals.
Champion a data-driven culture across the organization, fostering cross-functional collaboration and data literacy.
Advise senior leadership on strategic data initiatives and their impact on business performance.
Architecture and Modernization:
Evaluate and modernize the existing data architecture, recommending and implementing innovative solutions.
Design and implement a scalable data lake/warehouse architecture for future growth.
Advocate for and adopt cutting-edge data technologies and best practices.
ETL Tool Experience (8+ years):
Extensive experience in designing, developing, and implementing ETL (Extract, Transform, Load) processes using industry-standard tools such as Informatica PowerCenter, IBM DataStage, Microsoft SSIS, or open-source options like Apache Airflow.
Proven ability to build and maintain complex data pipelines that integrate data from diverse sources, transform it into usable formats, and load it into target systems.
Deep understanding of data quality and cleansing techniques to ensure the accuracy and consistency of data across the organization.
Data Governance and Quality:
Establish and enforce a comprehensive data governance framework ensuring data integrity, consistency, and security.
Develop and implement data quality standards and processes for continuous data improvement.
Oversee the implementation of master data management and data lineage initiatives.
Collaboration and Mentorship:
Mentor and guide data teams, including architects, engineers, and analysts, on data architecture principles and best practices.
Foster a collaborative environment where data insights are readily shared and acted upon across the organization.
Build strong relationships with business stakeholders to understand and translate their data needs into actionable solutions.
Qualifications:
Education: Master’s degree in Computer Science, Information Systems, or a related field; Ph.D. preferred.
Experience: 25+ years of experience in data architecture and design, with 10+ years in a leadership role.
Technical Skills:
Deep understanding of TOGAF, AWS, MDM, EDW, the Hadoop ecosystem (MapReduce, Hive, HBase, Pig, Flume, Sqoop), cloud data platforms (Azure Synapse, Google BigQuery), modern data pipelines, streaming analytics, and data governance frameworks.
Proficiency in programming languages (Java, Python, SQL), scripting languages (Bash, Python), data modelling tools (ER diagramming software), and BI tools.
Extensive expertise in ETL tools (Informatica PowerCenter, IBM DataStage, Microsoft SSIS, Apache Airflow)
Familiarity with emerging data technologies (AI/ML, blockchain), data security and compliance frameworks.
Soft Skills:
Outstanding communication, collaboration, and leadership skills.
Strategic thinking and problem-solving abilities with a focus on delivering impactful solutions.
Strong analytical and critical thinking skills.
Ability to influence and inspire teams to achieve goals.
Scrum Master
Job Sector: IT, Software
Job Type: Permanent
Location: Chennai
Experience: 5 -6 Years
Salary: 10 - 12 LPA
Education: B.E/BTech
Notice Period: Immediate
Key Skills: Web apps developer, scrum master, Agile, waterfall, Azure, AWS
Contact at triple eight two zero nine four two double seven
Job Description:
Knowledge & Experience
- 5+ years of Scrum Master experience.
- Experience working with an offshore/onshore team model
- Thorough understanding of agile software development methodologies, values, and procedures.
- Thorough understanding of the software development lifecycle.
- Understanding of organization’s development platform and languages.
- Ability to understand technical issues.
Where: Hyderabad/Bengaluru, India (Hybrid Mode, 3 Days/Week in Office)
Job Description:
- Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements.
- Create an inventory of the data necessary to build and implement a data architecture.
- Envision data pipelines and how data will flow through the data landscape.
- Evaluate current data management technologies and what additional tools are needed.
- Determine upgrades and improvements to current data architectures.
- Design, document, build and implement database architectures and applications. Should have hands-on experience in building high scale OLAP systems.
- Build data models for database structures, analytics, and use cases.
- Develop and enforce database development standards, with solid DB/query optimization capabilities.
- Integrate new systems and functions like security, performance, scalability, governance, reliability, and data recovery.
- Research new opportunities and create methods to acquire data.
- Develop measures that ensure data accuracy, integrity, and accessibility.
- Continually monitor, refine, and report data management system performance.
Required Qualifications and Skillset:
- Extensive knowledge of Azure, GCP clouds, and DataOps Data Eco-System (super strong in one of the two clouds and satisfactory in the other one)
- Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB. (Expertise in any 3 is a must)
- Azure Data Factory, Dataiku, Fivetran, Google Cloud Dataflow (Any 2)
- Hands-on experience in working with services/technologies like - Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (Expertise in any 2 is required)
- Well-versed with Data services, integration, ingestion, ELT/ETL, Data Governance, Security, and Meta-driven Development.
- Expertise in RDBMS (relational database management system) – writing complex SQL logic, DB/Query optimization, Data Modelling, and managing high data volume for mission-critical applications.
- Strong grip on programming using Python and PySpark.
- Clear understanding of data best practices prevailing in the industry.
- Preference to candidates having Azure or GCP architect certification. (Either of the two would suffice)
- Strong networking and data security experience.
Awareness of the Following:
- Application development understanding (Full Stack)
- Experience on open-source tools like Kafka, Spark, Splunk, Superset, etc.
- Good understanding of Analytics Platform Landscape that includes AI/ML
- Experience in any Data Visualization tool like PowerBI / Tableau / Qlik /QuickSight etc.
About Us
Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open standard low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards.
We Offer You:
- a chance to try new things & take risks.
- meaningful problems you'll be proud to solve.
- people you will be comfortable working with.
- transparent and innovative work environment.
To know more about us visit Gramener Website and Gramener Blog.
If you are looking for the same, kindly share the below-mentioned details.
Total Experience:
Relevant Experience:
Notice Period:
CTC:
ECTC:
Current Location:
Exp:2-6yrs
Location: Navi Mumbai
80% overseas travelling.
1) CCNA, MSSA.
2) Common security technologies and practices.
3) Storage Platforms technologies (RAID, SAN, NAS, tape libraries).
4) High availability technologies (i.e. based on OS, Veritas, Oracle).
5) Experience with business intelligence applications such as Crystal Reports, Oracle BI.
6) Previous experience from hospital environment and radiological workflow, including technical skills on Radiology Information systems (RIS) and Picture Archiving Communication Systems (PACS) are desirable.
7) Virtualization technologies VMware / Hyper-V.
8) Knowledge of DICOM and HL7 technologies.
- Total experience of 10-12 years with excellent credentials in BI/DW and retail domain experience
- Minimum 5 years of project management experience delivering medium to large engagements
- Experience in Agile project management and scrum methodology
- Must have experience of delivering engagements with Azure Data Platform (Azure, Databricks, ADF, PySpark) and/or Microstrategy/Business Objects
- Should be able to plan, manage and execute projects end to end independently
- Should have experience of drafting SOWs, project status reports and financials management
Job Description
Experience : 10+ Years
Location : Pune
Job Requirements:
- Minimum of 10+ years of experience with a proven record of increased responsibility
- Hands-on experience in the design, development, and management of Big Data, Cloud, Data Warehousing, and Business Intelligence projects
- Experience managing projects in Big Data, Cloud, Data Warehousing, and Business Intelligence using open-source or top-of-the-line tools and technologies
- Good knowledge of Dimensional Modeling
- Experience working with any ETL and BI reporting tools
- Experience managing medium to large projects, preferably on Big Data
- Proven experience in project planning, estimation, execution, and implementation of medium to large projects
- Should be able to communicate effectively in English
- Strong management and leadership skills, with a proven ability to develop and manage client relationships
- Proven problem-solving skills from both technical and managerial perspectives
- Attention to detail and a commitment to excellence and high standards
- Excellent interpersonal and communication skills, both verbal and written
- Position is remote with occasional travel to other offices, client sites, conventions, training locations, etc.
- Bachelor’s degree in Computer Science, Business/Economics, or a related field, or demonstrated equivalent/practical knowledge or experience
Job Responsibilities:
- Day-to-day project management, scrum and agile management, including project planning, delivery, and execution of Big Data and BI projects
- Primary point of contact for the customer for all project engagements, delivery, and project escalations
- Design the right architecture and technology stack, depending on business requirements, across Cloud / Big Data and BI related technologies, both on-premise and on cloud
- Liaise with key stakeholders to define the Cloud / Big Data solutions roadmap and prioritize the deliverables
- Responsible for end-to-end project delivery of Cloud / Big Data solutions from a project estimation, project planning, resourcing, and monitoring perspective
- Drive and participate in requirements-gathering workshops, estimation discussions, design meetings, and status review meetings
- Support and assist the team in resolving issues during testing and when the system is in production
- Involved in the full customer lifecycle with a goal to make customers successful and increase revenue and retention
- Interface with the offshore engineering team to solve customer issues
- Develop programs that meet customer needs with respect to functionality, performance, scalability, reliability, schedule, principles, and recognized industry standards
- Requirement analysis and documentation
- Manage day-to-day operational aspects of a project and its scope
- Prepare for engagement reviews and quality assurance procedures
- Visit and/or host clients to strengthen business relationships