We have the following opening in our organization:
Years of Experience: 4-8 years
Location: Mumbai (Thane)/BKC/Andheri
Notice period: Max 15 days or immediate
Educational Qualification: MCA/ME/MSc-IT/BE/B.Tech/BCA/BSc-IT in Computer Science
- 3-8 years of consulting or relevant work experience
- Strong proficiency in SQL Server 2008 R2 and above
- Excellent command of SQL, SSRS, SSIS and SSAS
- Data modeling, fact and dimension design, and hands-on work on data warehouse (DW) architecture design
- Experience implementing newer technologies such as Power BI, including Power BI modeling
- Knowledge of Azure or R programming is an added advantage
- Experience with BI and visualization technologies (Tableau, Power BI)
- Advanced T-SQL programming skills
- Ability to scope out a simple or semi-complex project based on business requirements and achievable benefits
- Evaluate, design, and implement enterprise IT-based business solutions, often working on-site to help customers deploy their solutions
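The fact-and-dimension design mentioned above can be illustrated with a minimal star schema. The sketch below uses Python's built-in sqlite3 in place of SQL Server, and the table and column names are invented for illustration; a real warehouse would carry surrogate keys, slowly changing dimensions, and far wider fact tables.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, amount REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (10, 1, 3, 29.97), (11, 2, 1, 19.99), (12, 1, 2, 19.98);
""")

# Typical warehouse query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, ROUND(SUM(f.amount), 2) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchall()
print(rows)
```

The same grain/key discipline carries over directly to T-SQL on SQL Server.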
About NeoQuant Solutions Pvt Ltd
Our client combines Adtech and Martech platform strategy with data science & data engineering expertise, helping our clients make advertising work better for people.
- Act as primary day-to-day contact on analytics to agency-client leads
- Develop bespoke analytics proposals for presentation to agencies & clients, for delivery within the teams
- Ensure delivery of projects and services across the analytics team meets our stakeholder requirements (time, quality, cost)
- Hands-on experience with platforms for data pre-processing, including data transformation and data cleaning
- Ensure data quality and integrity
- Interpret and analyse data problems
- Build analytic systems and predictive models
- Increase the performance and accuracy of machine learning algorithms through fine-tuning and further optimization
- Visualize data and create reports
- Experiment with new models and techniques
- Align data projects with organizational goals
- Minimum 6-7 years' experience working in Data Science
- Prior experience as a Data Scientist within a digital media environment is desirable
- Solid understanding of machine learning
- A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
- Experience with SQL/ Big Query/GMP tech stack / Clean rooms such as ADH
- A knack for statistical analysis and predictive modelling
- Good knowledge of R, Python
- Experience with SQL, MYSQL, PostgreSQL databases
- Knowledge of data management and visualization techniques
- Hands-on experience on BI/Visual Analytics Tools like PowerBI or Tableau or Data Studio
- Evidence of technical comfort and good understanding of internet functionality desirable
- Analytical pedigree - evidence of having approached problems from a mathematical perspective and working through to a solution in a logical way
- Proactive and results-oriented
- A positive, can-do attitude with a thirst to continually learn new things
- An ability to work independently and collaboratively with a wide range of teams
- Excellent communication skills, both written and oral
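The data pre-processing and cleaning responsibilities listed above can be sketched with plain Python. All records, field names, and the mean-imputation choice below are invented for illustration; production work would typically use pandas or a similar library.

```python
# Hypothetical raw records with missing and inconsistent values.
raw = [
    {"user": "a", "spend": "120.5", "country": "IN"},
    {"user": "b", "spend": "",      "country": "in"},
    {"user": "c", "spend": "n/a",   "country": "IN "},
]

def clean(record):
    """Coerce types, normalize strings, and mark unparseable spend as None."""
    spend = record["spend"].strip()
    try:
        spend = float(spend)
    except ValueError:
        spend = None  # flagged for imputation below
    return {
        "user": record["user"],
        "spend": spend,
        "country": record["country"].strip().upper(),
    }

cleaned = [clean(r) for r in raw]

# Simple mean imputation for the missing spend values.
known = [r["spend"] for r in cleaned if r["spend"] is not None]
mean_spend = sum(known) / len(known)
for r in cleaned:
    if r["spend"] is None:
        r["spend"] = mean_spend
```

The transform/clean/impute split mirrors the pipeline stages the role describes.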
As a part of the Data Science & Analytics team at Rupifi, you will play a significant role in helping define the business/product vision and deliver it from the ground up, working with passionate, high-performing individuals in a very fast-paced environment.
You will work closely with Data Scientists & Analysts, Engineers, Designers, Product Managers, Ops Managers and Business Leaders, and help the team make informed data-driven decisions and deliver high business impact.
1. Use statistical and machine learning techniques to create scalable risk management systems
2. Design, develop and evaluate highly innovative models for risk management
3. Establish scalable, efficient and automated processes for model development, model validation and model implementation
4. Analyse data to better understand potential risks, concerns and outcomes of decisions
5. Aggregate data from multiple sources to provide a comprehensive assessment
6. Create reports, presentations and process documents to display impactful results
7. Collaborate with other team members to effectively analyze and present data
8. Develop insights and data visualizations to solve complex problems and communicate ideas to stakeholders
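Point 2 above (developing and evaluating risk models) can be sketched as a toy logistic regression trained with gradient descent. This is a pure-Python stand-in for illustration only; the feature, labels, and hyperparameters are all invented, and real risk models would use established libraries and far richer data.

```python
import math

# Invented training data: one normalized risk feature per borrower,
# label 1 = defaulted, 0 = repaid.
X = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]
y = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Batch gradient descent on the logistic loss.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for xi, yi in zip(X, y):
        p = sigmoid(w * xi + b)
        gw += (p - yi) * xi
        gb += (p - yi)
    w -= lr * gw / len(X)
    b -= lr * gb / len(X)

def predict(x):
    """Return the modeled probability of default for feature value x."""
    return sigmoid(w * x + b)

print(predict(0.2), predict(1.2))  # low risk vs. high risk
```

The same fit/validate/score loop is what points 3-4 automate at scale.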
● Hands-on experience in Python/R & SQL
● Hands-on experience in machine and deep learning (e.g., gradient boosting machines, XGBoost, neural networks) as well as classic statistical modeling techniques and assumptions
● Experience in handling complex and large data sources
● Experience in modeling techniques in the fintech/banking domain
● Experience in working on Big data and distributed computing
● Bachelor's/Master's degree in Maths, Data Science, Computer Science, Engineering, Statistics, Economics or a similar quantitative field
● 4 to 10 years of modeling experience in the fintech/banking domain in fields like collections, underwriting, customer management, etc.
● Strong analytical skills with good problem-solving ability
● Strong presentation and communication skills
● Experience in working on advanced machine learning techniques
● Quantitative and analytical skills with a demonstrated ability to understand new analytical concepts
We are looking for a motivated data analyst with sound experience in handling web/digital analytics, to join us as part of the Kapiva D2C Business Team. This team is primarily responsible for driving sales and customer engagement on our website. This channel has grown 5x in revenue over the last 12 months and is poised to grow another 5x over the next six months. It represents a high-growth, important part of our overall e-commerce growth strategy.
The mandate here is to run an end-to-end sustainable e-commerce business, boost sales through marketing campaigns, and build a cutting edge product (website) that optimizes the customer’s journey as well as increases customer lifetime value.
The Data Analyst will support the business heads by providing data-backed insights to drive customer growth, retention and engagement. They will be required to set up and manage reports, test various hypotheses and coordinate with various stakeholders on a day-to-day basis.
Strategy and planning:
● Work with the D2C functional leads and support analytics planning on a quarterly/ annual basis
● Identify reports and analytics needed to be conducted on a daily/ weekly/ monthly frequency
● Drive planning for hypothesis-led testing of key metrics across the customer funnel
● Interpret data, analyze results using statistical techniques and provide ongoing reports
● Analyze large amounts of information to discover trends and patterns
● Work with business teams to prioritize business and information needs
● Collaborate with engineering and product development teams to setup data infrastructure as needed
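The hypothesis-led testing of funnel metrics mentioned above often comes down to a two-proportion z-test on an A/B experiment. The sketch below uses only the standard library; the conversion counts are invented for illustration.

```python
from math import sqrt, erf

# Hypothetical A/B test on one funnel step: did variant B lift conversion?
conv_a, n_a = 120, 2400   # control: 120 conversions out of 2400 sessions
conv_b, n_b = 156, 2400   # variant: 156 conversions out of 2400 sessions

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)

# Standard error under the pooled null hypothesis of equal rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(f"z={z:.2f}, p={p_value:.4f}")
```

A p-value below the chosen significance level (commonly 0.05) would support rolling out the variant.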
Reporting and communication:
● Prepare reports / presentations to present actionable insights that can drive business objectives
● Set up live dashboards reporting key cross-functional metrics
● Coordinate with various stakeholders to collect useful and required data
● Present findings to business stakeholders to drive action across the organization
● Propose solutions and strategies to business challenges
● Bachelor’s/ Masters in Mathematics, Economics, Computer Science, Information Management, Statistics or related field
● High proficiency in MS Excel and SQL
● Knowledge of one or more programming languages like Python/ R. Adept at queries, report writing and presenting findings
● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy - working knowledge of statistics and statistical methods
● Ability to work in a highly dynamic environment across cross-functional teams; good at coordinating with different departments and managing timelines
● Exceptional English written/verbal communication
● A penchant for understanding consumer traits and behavior and a keen eye to detail
Good to have:
● Hands-on experience with one or more web analytics tools like Google Analytics, Mixpanel, Kissmetrics, Heap, Adobe Analytics, etc.
● Experience in using business intelligence tools like Metabase, Tableau, Power BI is a plus
● Experience in developing predictive models and machine learning algorithms
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. Ability to work in a fast-paced work environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts, and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Exercise your experience in the development lifecycle through analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
- 3-5 years of relevant work experience in the areas outlined below
- Experience in extracting data using SQL from large databases
- Experience in writing complex ETL processes and frameworks for analytics and data management. Must have experience in working on ETL tools.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable with analyzing complex, high-volume and high-dimension data from varying sources
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
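The ETL experience the posting asks for can be sketched end to end with Python's built-in sqlite3. The table names, device categories, and aggregation below are invented for illustration; real pipelines would run in a dedicated ETL tool, as the requirements note.

```python
import sqlite3

# --- Extract: pull raw usage events from a hypothetical source database.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (device TEXT, minutes REAL)")
src.executemany("INSERT INTO events VALUES (?, ?)",
                [("tv", 30.0), ("tv", 45.0), ("mobile", 12.5)])
rows = src.execute("SELECT device, minutes FROM events").fetchall()

# --- Transform: aggregate viewing minutes per device type.
totals = {}
for device, minutes in rows:
    totals[device] = totals.get(device, 0.0) + minutes

# --- Load: write the aggregated result into the analytics target store.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE device_usage (device TEXT PRIMARY KEY, total_minutes REAL)")
dst.executemany("INSERT INTO device_usage VALUES (?, ?)", totals.items())
out = dst.execute("SELECT * FROM device_usage ORDER BY device").fetchall()
print(out)  # [('mobile', 12.5), ('tv', 75.0)]
```

The extract/transform/load split is the same whether the stages run in SQL, Spark, or a commercial ETL framework.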
● Work on an awesome AI product for the eCommerce domain.
● Build the next-generation information extraction and computer vision product powered by state-of-the-art AI and Deep Learning techniques.
● Work with an international top-notch engineering team fully committed to Machine Learning development.
Desired Candidate Profile
● Passionate about search & AI technologies; open to collaborating with colleagues.
● Good understanding of the mainstream deep learning models from multiple domains: computer vision, NLP, reinforcement learning, model optimization, etc.
● Hands-on experience with deep learning frameworks (e.g. TensorFlow, PyTorch, MXNet) and models such as BERT. Able to implement the latest DL models using existing APIs and open-source libraries in a short time.
● Hands-on experience with Cloud-Native techniques. Good understanding of web services and modern software technologies.
● Maintained or contributed to machine learning projects; familiar with the agile software development process, CI/CD workflow, ticket management, code review and version control.
● Skilled in the following programming languages: Python 3.
● Good English skills especially for writing and reading documentation
We are looking for
A Senior Software Development Engineer (SDE2) who will be instrumental in the design and development of our backend technology, which manages our exhaustive data pipelines and AI models. Simplifying complexity and building technology that is robust and scalable is your North Star. You'll work closely alongside our CTO and machine learning engineers, frontend and wider technical team to build new capabilities, focused on speed and reliability.
You'll own your work, to build, test and iterate quickly, with direct guidance from our CTO.
Please note: You must have proven industry experience greater than 2 years.
Your work includes
- Own and manage the whole engineering infrastructure that supports the Greendeck platform.
- Work to create highly scalable, robust and available Python microservices.
- Design the architecture to stream data on a huge scale across multiple services.
- Create and manage data pipelines using tools like Kafka, Celery.
- Deploy Serverless functions to process and manage data.
- Work with a variety of databases and storage systems to store and strategically manage data.
- Write connections to collect data from various third party services, data storages and APIs.
- Strong experience in Python, creating scripts, apps or services
- Strong automation and scripting skills
- Knowledge of at least one SQL and one NoSQL database
- Experience working with messaging systems like Kafka, RabbitMQ
- Good knowledge of data frames and data manipulation
- Have used and deployed apps using FastAPI or Flask or similar tech
- Knowledge of CI/CD paradigm
- Basic knowledge about Docker
- Have knowledge of creating and using REST APIs
- Good knowledge of OOP Fundamentals.
- (Optional) Knowledge about Celery/ Airflow
- (Optional) Knowledge about Lambda/ Serverless
- (Optional) Have connected apps using OAuth
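The Kafka-style data pipelines described above follow a producer/consumer pattern. The sketch below shows that pattern in-process with the standard library's queue and threading modules; message contents and the 1.2 "transform" factor are invented, and a real deployment would use Kafka or RabbitMQ brokers rather than an in-memory queue.

```python
import queue
import threading

# In-process stand-in for a message broker.
q = queue.Queue()
results = []

def producer():
    """Publish a few hypothetical pricing events, then a shutdown sentinel."""
    for i in range(5):
        q.put({"event_id": i, "price": 10.0 + i})
    q.put(None)  # sentinel: no more messages

def consumer():
    """Consume messages until the sentinel, transforming each one."""
    while True:
        msg = q.get()
        if msg is None:
            break
        # Micro-service work: transform the message before storage.
        results.append(msg["price"] * 1.2)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(results))  # 5 messages processed
```

Swapping queue.Queue for a Kafka topic changes the transport, not the shape of the code.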
What you can expect
- Attractive pay, bonus scheme and flexible vacation policy.
- A truly flexible, trust-based, performance driven work culture.
- Lunch is on us, every day!
- A young and passionate team building elegant products with intricate technology for the future of businesses around the world. Our average age is 25!
- The chance to make a huge difference to the success of a world-class SaaS product and the opportunity to make an impact.
It's important to us
- That you relocate to Indore
- That you have a minimum of 2 years of experience working as a Software Developer
We cater to a wide range of entertainment categories including video streaming, music streaming, games and short videos via our MX Player and MX Takatak apps which are our flagship products.
Both MX Player and MX Takatak iOS apps are frequently featured amongst the top 5 apps in the Entertainment category on the Indian App Store. These are built by a small team of engineers based in Mumbai.
Roles and responsibilities:
- Years of experience: 5+ years
- Ability to synthesize complex data into actionable goals.
- Interpersonal skills to work collaboratively with various stakeholders with competing interests.
- Identify analytical trends and logic to better promote content on the platform
- Evaluate content and carry out qualitative research on Competition Analysis, Viewership trends for OTT.
- Build hypotheses and test them through relevant KPIs.
- Communicate key findings to programming stakeholders and instill best practices in the programming team.
Skills & Competencies:
- Good knowledge of BigQuery, SQL, Python/R, MS Excel and Data Infrastructure.
- Ability to work as a team player in a target-driven work environment, meeting deadlines.
- Excellent Time Management Skills.
- Strong Communication Skills.
- Proficient in MS Office Suite
About the role
- Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering on large complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.
- As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud first world and own those integrations end to end working closely with business units. You will design and build for efficiency, reliability, security and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.
- 1-6 years of relevant experience
- Strong SQL skills and data literacy
- Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
- Proficiency in scripting and automation (e.g. PowerShell, Bash, Python)
- Experience in an enterprise data environment
- Strong communication skills
- Ability to work on data architecture, data models, data migration, integration and pipelines
- Ability to work on data platform modernisation from on-premise to cloud-native
- Proficiency in data security best practices
- Stakeholder management experience
- Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
- Desire to gain breadth and depth of technologies to support customer's vision and project objectives
What to expect if you join Servian?
- Learning & Development: We invest heavily in our consultants and offer internal training weekly (both technical and non-technical alike!) and abide by a 'You Pass, We Pay' policy.
- Career progression: We take a longer term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
- Variety of projects: As a consultant, you will have the opportunity to work across multiple projects across our client base significantly increasing your skills and exposure in the industry.
- Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks and the metro station.
- Professional development: We invest heavily in professional development both technically, through training and guided certification pathways, and in consulting, through workshops in client engagement and communication. Growth in our organisation happens from the growth of our people.
We are looking for candidates with 3-6 years of BI/DW experience and expertise in Spark, Scala and SQL. An Azure background is required.
* Spark hands on : Must have
* Scala hands on : Must have
* SQL expertise : Expert
* Azure background : Must have
* Python hands on : Good to have
* ADF, Databricks: Good to have
* Should be able to communicate effectively and deliver technology implementation end to end
Looking for candidates who can join within 15 to 30 days, or who are available immediately.
Fragma Data Systems
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Should be aware of database development for both on-premises and SaaS applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using logistic and linear regression, decision tree and random forest algorithms.
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning and processing steps using Python, Scala and R.
- Programming experience using python libraries NumPy, Pandas and Matplotlib.
- Implementing NoSQL databases and writing queries using Cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premise SQL server database to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for ad-hoc reporting.
- Raw data load automation using T-SQL and SSIS
- Expert in migrating existing on-premises databases to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server