Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Job Description:
- Candidate should have strong technical and analytical skills, particularly in SQL Server, reporting tools such as Tableau, Power BI, and SSRS, and .NET.
- Candidate should have the experience needed to properly understand the project deliverables.
- Candidate will be responsible for the respective tasks assigned in the project.
- Candidate will be responsible for deliverables of proper quality, within the planned time and cost, adhering to the industry standards defined for the project.
- Candidate should be involved in client interaction.
- Candidate should possess excellent communication skills.
Required Skills: BI Gateway, MS SQL Server, Tableau, Power BI, .NET, OLAP, UI/UX, Dashboard Building
Experience: 5+ years
Job Location: Remote/Saudi Arabia
Work Timings: 2:30 pm to 11:30 pm
About Codvoai
At Codvo, we accelerate Cloud, AI, and Transformation roadmaps while offering the most satisfying mix of work-life balance, quality of living, and cutting-edge work to our employees.
We deliver value through our unique "Virtual Silicon Valley" model, where we bring seasoned experts and global talent together as a Product Oriented Delivery (POD) unit to successfully deliver on your next roadmap priorities.
Our “Virtual Silicon Valley” PODs deliver better success and speed because they are self-managed, have the right expertise mix, and, most importantly, are aligned to work in your time zone. The goal is to balance speed, expertise mix, and cost while ensuring the success of core product development, design, and transformation activities.
We are proud to have our customers ready to vouch for us and share their success stories. Our teams of scientists, engineers, architects, and designers have helped AI-driven companies, fast-growing Fintechs, Wealth Management & Healthcare startups, Energy companies, and US Defense contractors accelerate their product and transformation roadmaps.
Position Overview: We are seeking a talented Data Engineer with expertise in Power BI to join our team. The ideal candidate will be responsible for designing and implementing data pipelines, as well as developing insightful visualizations and reports using Power BI. Additionally, the candidate should have strong skills in Python, data analytics, PySpark, and Databricks. This role requires a blend of technical expertise, analytical thinking, and effective communication skills.
Key Responsibilities:
- Design, develop, and maintain data pipelines and architectures using PySpark and Databricks.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes.
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into actionable insights.
- Develop interactive dashboards, reports, and visualizations using Power BI to communicate key metrics and trends.
- Optimize and tune data pipelines for performance, scalability, and reliability.
- Monitor and troubleshoot data infrastructure to ensure data quality, integrity, and availability.
- Implement security measures and best practices to protect sensitive data.
- Stay updated with emerging technologies and best practices in data engineering and data visualization.
- Document processes, workflows, and configurations to maintain a comprehensive knowledge base.
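The pipeline responsibilities above follow a standard extract-transform-load pattern. A minimal sketch of that pattern, using only the Python standard library and SQLite in place of PySpark/Databricks (the table and field names here are invented for illustration):

```python
import sqlite3

def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Transform: normalize country codes and drop rows missing an amount."""
    out = []
    for order_id, country, amount in rows:
        if amount is None:
            continue  # data-quality rule: reject incomplete records
        out.append((order_id, country.strip().upper(), round(amount, 2)))
    return out

def load(rows, conn):
    """Load: write cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS clean_orders "
        "(order_id INTEGER, country TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM clean_orders").fetchone()[0]

# Hypothetical source data: one record fails the quality rule.
source = [(1, " us ", 10.0), (2, "de", None), (3, "in", 7.5)]
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(source)), conn)
print(loaded)  # 2 rows survive the transform step
```

In PySpark the same stages would typically become `spark.read` (extract), `DataFrame` transformations (transform), and `DataFrame.write` (load), but the separation of concerns is the same.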
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or related field. (Master’s degree preferred)
- Proven experience as a Data Engineer with expertise in Power BI, Python, PySpark, and Databricks.
- Strong proficiency in Power BI, including data modeling, DAX calculations, and creating interactive reports and dashboards.
- Solid understanding of data analytics concepts and techniques.
- Experience working with Big Data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in programming languages such as Python and SQL.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to work independently and manage multiple tasks simultaneously in a fast-paced environment.
Preferred Qualifications:
- Advanced degree in Computer Science, Engineering, or related field.
- Certifications in Power BI or related technologies.
- Experience with data visualization tools other than Power BI (e.g., Tableau, QlikView).
- Knowledge of machine learning concepts and frameworks.
Brief:
As a BI Developer at GradRight, you’ll be working with Tableau and supporting data sources to build reports for the requirements of various business teams.
Responsibilities:
- Translate business needs to technical specifications for reports and dashboards
- Design, build and deploy BI solutions
- Maintain and support data analytics platforms (e.g., Tableau, Mixpanel, Google Analytics)
- Evaluate and improve existing BI systems
- Collaborate with teams to integrate systems
- Develop and execute database queries, conduct analysis and prepare data to be shared with respective stakeholders
- Create visualizations and reports for requested projects
- Develop and update technical documentation around reports
Requirements:
- At least 3 years of proven experience as a BI Developer
- Experience at a startup
- Background in data warehouse design (e.g. dimensional modeling) and data mining
- In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (Extract, transform, load) framework
- Working knowledge of Tableau
- Knowledge of SQL queries and MongoDB
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
The Platform Data Science team works at the intersection of data science and engineering. Domain experts develop and advance platforms, including the data platforms, the machine learning platform, and other platforms for Forecasting, Experimentation, Anomaly Detection, Conversational AI, Underwriting of Risk, Portfolio Management, Fraud Detection & Prevention, and many more. We are also the Data Science and Analytics partners for Product and provide Behavioural Science insights across Jupiter.
About the role:
We’re looking for strong Software Engineers who can combine EMR, Redshift, Hadoop, Spark, Kafka, Elasticsearch, TensorFlow, PyTorch, and other technologies to build the next-generation Data Platform, ML Platform, and Experimentation Platform. If this sounds interesting, we’d love to hear from you!
This role will involve designing and developing software products that impact many areas of our business. The individual in this role will help define requirements, create software designs, implement code to those specifications, provide thorough unit and integration testing, and support products while they are deployed and used by our stakeholders.
Key Responsibilities:
Participate in, own, and influence the architecture and design of systems
Collaborate with other engineers, data scientists, product managers
Build intelligent systems that drive decisions
Build systems that enable us to perform experiments and iterate quickly
Build platforms that enable scientists to train, deploy and monitor models at scale
Build analytical systems that drive better decision-making
Required Skills:
Programming experience with at least one modern language such as Java or Scala, including object-oriented design
Experience in contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems
Bachelor’s degree in Computer Science or related field
Computer Science fundamentals in object-oriented design
Computer Science fundamentals in data structures
Computer Science fundamentals in algorithm design, problem solving, and complexity analysis
Experience in databases, analytics, big data systems or business intelligence products:
Data lake, data warehouse, ETL, ML platform
Big data technologies such as Hadoop and Apache Spark
Who we are:
Stanza Living is India’s largest and fastest growing tech-enabled, managed accommodation company that delivers a hospitality-led living experience to migrant students and young working professionals across India.
We have a full-stack business model that focuses on the design, development, and delivery of daily living solutions tailored to young consumers’ lifestyles. From smartly-planned residences and a host of amenities and services for hassle-free living to exclusive community engagement programmes – everything is seamlessly integrated through technology to ensure the highest consumer delight.
Today, we are:
• India’s largest managed accommodation company with over 50,000 beds under management across 24+ cities
• Most capitalized player in the managed accommodation space, backed by global marquee investors – Falcon Edge, Equity International, Sequoia Capital, Matrix Partners, Accel Partners
• Recognized as the Best Real Estate Tech company across the Globe in 2020 by leading analysis agency, Tracxn
• LinkedIn Top Startup to Work for - 2022
The opportunity:
Job Responsibilities:
• Perform data analysis on large volumes of data to identify trends and/or data processing rules
• Be a team player within the core analytics team.
• Responsible for weekly and monthly Sales/Marketing reports on a gross and net basis, plus other ad hoc reports.
• Generate reports on a daily basis at all stages.
• Analyze the data to surface insights on what leads to better conversions, student preferences, the role of various investments and channels, optimizing spend, etc.
• Prepare reports and dashboards for various business functions to keep track of important business metrics.
• Elicit and document requirements at various levels including Business, Logical and Physical/Technical
Skill Sets
• Good hands-on experience with Advanced Excel & SQL.
• Has extensively worked on live dashboards, reporting, data manipulation, and building flat tables in SQL.
• Knowledge of Python/R.
• Strong analytical skills and ability to interpret data
• Natural curiosity and self-drive to understand the broader business in order to provide the appropriate reporting support
• Extremely high ownership; a self-starter who can work in a constantly changing and fast-growing environment
• Establish collaborative and trusting relationships with the business’s key internal leaders and stakeholders in order to ensure that there is a free flow of ideas and information across the business
• First principle thinking and strong problem solving
What Can You Expect:
• A phenomenal work environment, with extremely high ownership and growth opportunities
• Opportunity to shape a potential unicorn
• Quick iterations and deployments - fail-fast attitude
• Opportunity to work on cutting-edge technologies
• Access to a world-class mentorship network
A Business Transformation Organization that partners with businesses to co-create customer-centric, hyper-personalized solutions to achieve exponential growth. Invente offers platforms and services that enable businesses to provide human-free customer experience and business process automation.
Location: Hyderabad (WFO)
Budget: Open
Position: Azure Data Engineer
Experience: 5+ years of commercial experience
Responsibilities
● Design and implement Azure data solutions using ADLS Gen 2.0, Azure Data Factory, Synapse, Databricks, SQL, and Power BI
● Build and maintain data pipelines and ETL processes to ensure efficient data ingestion and processing
● Develop and manage data warehouses and data lakes
● Ensure data quality, integrity, and security
● Implement existing use cases required by the AI and analytics teams.
● Collaborate with other teams to integrate data solutions with other systems and applications
● Stay up-to-date with emerging data technologies and recommend new solutions to improve our data infrastructure
· Familiar with the MicroStrategy architecture; Admin Certification preferred
· Familiar with administrative functions, using Object Manager, Command Manager, installation/configuration of MSTR in clustered architecture, applying patches, hot-fixes
· Monitor and manage existing Business Intelligence development/production systems
· MicroStrategy installation, upgrade and administration on Windows and Linux platform
· Ability to support and administer multi-tenant MicroStrategy infrastructure including server security troubleshooting and general system maintenance.
· Analyze application and system logs while troubleshooting and root cause analysis
· Work on operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration.
· Monitor, report and investigate solutions to improve report performance.
· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.
· Provide support for the platform, report execution and implementation, user community and data investigations.
· Identify improvement areas in Environment hosting and upgrade processes.
· Identify automation opportunities and participate in automation implementations
· Provide on-call support for Business Intelligence issues
· Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features such as Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.
· Familiar with AWS, Linux Scripting
· Knowledge of MSTR Mobile
· Knowledge of capacity planning and system’s scaling needs
Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
- Performing extensive analysis on SQL, Google Analytics & Excel from a product standpoint to provide quick recommendations to the management
- Establishing scalable, efficient and automated processes to deploy data analytics on large data sets across platforms
What you need to have:
- B.Tech /B.E.; Any Graduation
- Strong background in statistical concepts & calculations to perform analysis/ modeling
- Proficient in SQL and other BI tools like Tableau, Power BI etc.
- Good knowledge of Google Analytics and any other web analytics platforms (preferred)
- Strong analytical and problem-solving skills to analyze large datasets
- Ability to work independently and bring innovative solutions to the team
- Experience of working with a start-up or a product organization (preferred)
Roles and Responsibilities
We are seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. Must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehousing, and reporting techniques.
- Extensive experience in providing AWS Cloud solutions to various business use cases.
- Creating star schema data models, performing ETL, and validating results with business representatives.
- Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, and communicating functional and technical issues.
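The star-schema modelling mentioned above can be sketched with a toy example: one fact table whose foreign keys resolve against two dimension tables, queried with the kind of rollup a BI report would issue. This is an illustrative sketch using SQLite; all table names and figures are invented:

```python
import sqlite3

# A toy star schema: fact_sales at the center, dim_date and dim_product
# as the dimensions (names are hypothetical, not from the posting).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20240101, "Jan"), (20240201, "Feb")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", [
    (20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0),
])

# A typical BI rollup: revenue by month, resolved through the date dimension.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # [('Feb', 75.0), ('Jan', 150.0)]
```

On AWS the same shape would typically live in Redshift, with the fact table distributed on its join key, but the schema design is identical.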
Job Description:
This position is responsible for the successful delivery of business intelligence information to the entire organization and requires experience in BI development and implementations, data architecture, and data warehousing.
Requisite Qualifications
Essential: AWS Certified Database Specialty or AWS Certified Data Analytics
Preferred: Any other Data Engineer certification
Requisite Experience
Essential: 4-7 years of experience
Preferred: 2+ years of experience in ETL & data pipelines
Skills Required
Special Skills Required
- AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
- Big data: Databricks, Spark, Glue, and Athena
- Expertise in Lake Formation, Python programming, Spark, and shell scripting
- Minimum Bachelor’s degree with 5+ years of experience in designing, building, and maintaining AWS data components
- 3+ years of experience in data component configuration, related roles, and access setup
- Expertise in Python programming
- Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)
- Comfortable working with DevOps tools: Jenkins, Bitbucket, CI/CD
- Hands-on ETL development experience, preferably using SSIS
- SQL Server experience required
- Strong analytical skills to solve and model complex business requirements
- Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehousing, and reporting techniques
Preferred Skills
- Experience working in a SCRUM environment.
- Experience in administration (Windows/Unix/Network/ a plus.
- Experience in SQL Server, SSIS, SSAS, SSRS
- Comfortable with creating data models and visualizations using Power BI
- Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
- Ability to collaborate on a team with infrastructure, BI report development, and business analyst resources, and clearly communicate solutions to both technical and non-technical team members
As a Power BI and QlikView Developer, the candidate is expected to be a key contributor to the implementation of data analytics dashboards – from data preparation to dashboard development, unit testing, and deployment. The candidate's primary work focus is as follows:
- Understanding the database design.
- Develop efficient SQL queries, from simple to complex, and test the data output as part of the data preparation activity
- Development & unit testing of dashboards & data visualizations using Power BI
- Troubleshooting/debugging and rectifying issues
- Reviewing, giving feedback, and mentoring the team
- Adherence to standards and best practices as defined by the company, at the individual level and as a team
QUALIFICATIONS AND EXPERIENCE
- Degree in BE/ BTech with at least 3 to 5 years of overall experience
- Experience working on multiple databases such as MS SQL (mandatory), PostgreSQL, MongoDB, MySQL (a plus), etc., and on data analytics projects (a minimum of 2 to 3).
- 1 to 2 years of experience writing SQL Server queries is mandatory.
- Expertise in Power BI and knowledge of QlikView (1-2 years).
- Understanding other tools like Tableau, Domo etc. would be a great plus
- Experience in working on different types of visualizations, in addition to the generic ones – Scatter plots, Heat Maps, Geo maps, Gantt, Bubbles, Tree Maps etc.
- Experience working on Trends, forecasting etc. would be a plus
- Experience in working with projects teams and ensuring the successful delivery of the solution
Do you have a passion for using your skills to develop innovative technologies? Are you interested in working on a team of professionals at a globally respected education organization and using your talents for building solutions that help thousands of students achieve success? If so, join us and take your career to the next level. We are building a team of talented individuals to work on innovative products for education.
The Report Developer is a key contributor to information/data management solutions. Conceptualizes, designs, and manages a wide range of business, academic, and digital analytic solutions. Executes multiple projects including data sourcing, migration, quality, design, and implementation. The Report Developer partners with BI/Technology, Data Science, marketing/media, student enrollment advisors, faculty, academic administrators, strategy, planning, and creative/user experience teams to plan, execute, and evaluate a broad range of business, academic, marketing, and operations initiatives. Identifies best practices in producing reliable dashboards, analytics, and reporting. This role requires stakeholder interaction, so appropriate service orientation and communication skills are required.
Successful candidates will have good communication and presentation skills, critical thinking skills, and the ability to break down complex problems. Quick learners are preferred over many years of experience, and associates who aspire to make a difference over those who aim to just fill orders.
Key Responsibilities:
- Collaborates with SP&A team members / client groups to determine technical requirements of pending or current projects and develops analytical plans, including proper selection of methodologies, techniques, KPIs, and metrics.
- Ensures assigned work is completed for on-time delivery.
- Performs extraction of data sets from multiple sources/platforms, including student systems/data warehouse, web analytic tools, social listening tools, search tools, syndicated data, research & survey tools, etc. Implements hygiene and quality control steps.
- Conducts time series data analysis, segmentation, various metrics and key performance indicators, research & test design, significance testing, variance and growth calculations, forecasting, and return on investment.
- Applies industry best practices in research and cutting-edge analytic solutions with big data platforms and software to efficiently and effectively manage data.
- Supports training within and across teams on complex data and infrastructure topics.
- Builds business intelligence reporting, ad hoc solutions, and/or dashboards across various platforms.
- Develops and presents organized and clearly articulated analysis for stakeholder presentations.
- Applies knowledge of multi-channel marketing/research/academic learning principles.
- Participates in the full lifecycle of the Business Intelligence and Data Warehousing development process.
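The growth and return-on-investment calculations named in the responsibilities reduce to simple ratios. A small sketch with made-up figures:

```python
# Illustrative period-over-period growth and ROI calculations;
# the revenue figures below are invented for the example.

def growth_rate(previous, current):
    """Fractional change from one period to the next."""
    return (current - previous) / previous

def roi(gain, cost):
    """Return on investment: net gain relative to cost."""
    return (gain - cost) / cost

monthly_revenue = [100.0, 120.0, 138.0]
growth = [growth_rate(p, c)
          for p, c in zip(monthly_revenue, monthly_revenue[1:])]
print([round(g, 3) for g in growth])  # [0.2, 0.15]
print(roi(gain=150.0, cost=100.0))    # 0.5
```

In practice these would be computed over warehouse extracts rather than literals, but the formulas are the same.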
Qualifications:
- Bachelor's Degree (B.A./B.S.) or Master’s Degree (M.A./M.S.) in Computer Science, Decision Sciences, Information Management, or related fields.
- 2-5 years of relevant experience in business, marketing, or academic reporting and analytics.
- Strong analytical and critical thinking skills.
- Strong written and verbal communication skills, as well as presentation skills.
- Advanced knowledge of data specifications, data governance, data warehouses, and data structures.
- Proactive self-starter who can work collaboratively with cross-capability team members.
- Advanced knowledge of and competency in Excel.
- In-depth knowledge of SQL, Tableau, Microsoft SQL Server, Reporting Services, Analysis Services, Transact-SQL, and Power BI.
- Familiarity with cloud-based technologies is a definite asset.