Must have experience in the BFSI domain.
Experience: minimum 3 years.
Mandatory skills: experience in Power BI/Tableau.
About Reality Premedia Services Pvt Ltd
Roles & Responsibilities:
- Prepare and maintain E-commerce reports.
- Perform analysis to assess the quality and meaning of data.
- Prepare reports for management stating trends, patterns, and predictions using relevant data.
- Streamline communication channels between Sales, Operations, Accounts, and other teams.
- Build relationships with marketplace sellers, logistics partners, third-party vendors, and payment gateways.
- Assign numerical values to essential business functions so that business performance can be assessed and compared over periods of time.
- Assist marketplace sellers in listing products and executing orders.
- Process improvement and simplification.
- Maintain TAT as per set guidelines, to ensure smooth and timely operations.
- Creation of monthly & quarterly P&L reports.
- Overall Experience: Minimum 2 years of experience as a data analyst.
- Industry Experience: Fashion / Lifestyle / Retail
- Educational Qualification: Graduation / Post Graduation
- Behavioural Skills: Effective communication skills, prompt & proactive approach and eye for detail.
- Technical Skills: MS Office and e-commerce metrics; customer-focused.
- High-energy, results-oriented individual able to work in a fast-paced environment with quickly changing priorities and a strategic approach.
Job Location: Bangalore/Mumbai
Our client is the world’s largest media investment company and is a part of WPP. In fact, they are responsible for one in every three ads you see globally. We are currently looking for a Senior Software Engineer to join us. In this role, you will be responsible for coding and implementing the custom marketing applications that the Tech COE builds for its customers, and for managing a small team of developers.
What your day job looks like:
- Serve as a Subject Matter Expert on data usage – extraction, manipulation, and inputs for analytics
- Develop data extraction and manipulation code based on business rules
- Develop automated and manual test cases for the code written
- Design and construct data store and procedures for their maintenance
- Perform data extract, transform, and load activities from several data sources.
- Develop and maintain strong relationships with stakeholders
- Write high quality code as per prescribed standards.
- Participate in internal projects as required
- B. Tech./MCA or equivalent preferred
- At least 3 years of hands-on experience in Big Data, ETL development, and data processing.
What you’ll bring:
- Strong experience working with Snowflake, SQL, and PHP/Python.
- Strong experience writing complex SQL queries.
- Good communication skills.
- Good experience working with a BI tool such as Tableau or Power BI.
- Sqoop, Spark, EMR, and Hadoop/Hive are good to have.
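To illustrate the kind of extract–transform–load work this role involves, here is a minimal Python sketch. The standard-library sqlite3 module stands in for a real warehouse such as Snowflake, and the table, business rule, and field names are all hypothetical.

```python
import sqlite3

# Hypothetical business rule: load only completed orders, with amounts
# converted to lakhs. All statuses and amounts are invented for illustration.
def transform(orders):
    return [
        (o["id"], round(o["amount"] / 100000, 2))
        for o in orders
        if o["status"] == "completed"
    ]

def load(rows):
    # sqlite3 stands in for the real warehouse (e.g. Snowflake).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount_lakhs REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn

source = [  # the extract step would normally pull from an upstream system
    {"id": 1, "status": "completed", "amount": 250000},
    {"id": 2, "status": "pending", "amount": 90000},
]
conn = load(transform(source))
```

In a production pipeline the same shape holds: extraction from a source system, transformation under business rules, and a bulk load into the target store.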
Data Scientist (Risk)/Sr. Data Scientist (Risk)
As a part of the Data science/Analytics team at Rupifi, you will play a significant role in helping define the business/product vision and deliver it from the ground up by working with passionate high-performing individuals in a very fast-paced working environment.
You will work closely with Data Scientists & Analysts, Engineers, Designers, Product Managers, Ops Managers and Business Leaders, and help the team make informed data driven decisions and deliver high business impact.
Preferred Skills & Responsibilities:
- Analyze data to better understand potential risks, concerns and outcomes of decisions.
- Aggregate data from multiple sources to provide a comprehensive assessment.
- Past experience of working with business users to understand and define inputs for risk models.
- Ability to design and implement best-in-class risk models in the banking & fintech domain.
- Ability to quickly understand changing market trends and incorporate them into model inputs.
- Expertise in statistical analysis and modeling.
- Ability to translate complex model outputs into understandable insights for business users.
- Collaborate with other team members to effectively analyze and present data.
- Conduct research into potential clients and understand the risks of accepting each one.
- Monitor internal and external data points that may affect the risk level of a decision.
- Hands-on experience in Python & SQL.
- Hands-on experience with a visualization tool, preferably Tableau.
- Hands-on experience in machine and deep learning.
- Experience in handling complex data sources
- Experience in modeling techniques in the fintech/banking domain
- Experience working on Big Data and distributed computing.
- A BTech/BE/MSc degree in Math, Engineering, Statistics, Economics, ML, Operations Research, or similar quantitative field.
- 3 to 10 years of modeling experience in the fintech/banking domain in fields like collections, underwriting, customer management, etc.
- Strong analytical skills with good problem solving ability
- Strong presentation and communication skills
- Experience in working on advanced machine learning techniques
- Quantitative and analytical skills with a demonstrated ability to understand new analytical concepts.
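As a rough sketch of the scorecard-style risk models described above, the following plain-Python logistic scoring function maps borrower features to a probability of default. The weights, intercept, and feature names are invented for illustration; in practice they would be fitted on historical repayment data.

```python
import math

# Invented coefficients; in a real model these come from a logistic
# regression fitted on historical repayment outcomes.
WEIGHTS = {"utilization": 2.0, "late_payments": 0.8}
INTERCEPT = -3.0

def default_probability(features):
    """Map borrower features to a probability of default via the logistic function."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# A heavily utilized borrower with late payments scores riskier
# than a lightly utilized borrower with none.
risky = default_probability({"utilization": 0.9, "late_payments": 2})
safe = default_probability({"utilization": 0.1, "late_payments": 0})
```

Translating such a score into an accept/decline threshold is exactly the kind of model output that has to be explained to business users in plain terms.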
The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group is targeting the biggest enterprises wanting to utilize Cloudera’s services in private and public cloud environments. Our product is built on open source technologies like Hive, Impala, Hadoop, Kudu, Spark, and many more, providing unlimited learning opportunities.
A Day in the Life
Over the past 10+ years, Cloudera has experienced tremendous growth, making us the leading contributor to Big Data platforms and ecosystems and a leading provider of enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry, who are tackling challenges that will continue to shape the Big Data revolution. We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.
You will manage product development for our CDP components and develop engineering tools and scalable services to enable efficient development, testing, and release operations. You will be immersed in many exciting, cutting-edge technologies and projects, including collaboration with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.
Opportunity
Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the Open Source world. The candidate will be responsible for Apache Hive and CDW projects. We are looking for a candidate who would like to work on these projects upstream and downstream. If you are curious about the project and code quality, you can check the project and the code at the following link. You can start the development before you join; this is one of the beauties of the OSS world.
Apache Hive
•Build robust and scalable data infrastructure software
•Design and create services and system architecture for your projects
•Improve code quality through writing unit tests, automation, and code reviews
•Write Java code and/or build several services in the Cloudera Data Warehouse.
•Work with a team of engineers who review each other's code/designs and hold each other to an extremely high bar for the quality of code/designs.
•Understand the basics of Kubernetes.
•Build out the production and test infrastructure.
•Develop automation frameworks to reproduce issues and prevent regressions.
•Work closely with other developers providing services to our system.
•Help to analyze and to understand how customers use the product and improve it where necessary.
•Deep familiarity with Java programming language.
•Hands-on experience with distributed systems.
•Knowledge of database concepts, RDBMS internals.
•Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus.
•Has experience working in a distributed team.
•Has 3+ years of experience in software development.
Senior Data Scientist
- 6+ years of experience building data pipelines and deployment pipelines for machine learning models.
- 4+ years’ experience with ML/AI toolkits such as TensorFlow, Keras, AWS SageMaker, MXNet, H2O, etc.
- 4+ years’ experience developing ML/AI models in Python/R.
- Must have leadership skills to lead and deliver projects, be proactive, take ownership, interface with business, represent the team, and spread knowledge.
- Strong knowledge of statistical data analysis and machine learning techniques (e.g., Bayesian, regression, classification, clustering, time series, deep learning).
- Should be able to help deploy various models and tune them for better performance.
- Working knowledge of operationalizing models in production using model repositories, APIs, and data pipelines.
- Experience with machine learning and computational statistics packages.
- Experience with Databricks and Data Lake.
- Experience with Dremio, Tableau, and Power BI.
- Experience working with Spark ML and Spark DL with PySpark would be a big plus!
- Working knowledge of relational database systems like SQL Server, Oracle.
- Knowledge of deploying models in platforms like PCF, AWS, Kubernetes.
- Good knowledge in Continuous integration suites like Jenkins.
- Good knowledge in web servers (Apache, NGINX).
- Good knowledge in Git, Github, Bitbucket.
- Java, R, and Python programming experience.
- Should be very familiar with MS SQL, Teradata, Oracle, and DB2.
- Big Data – Hadoop.
- Expert knowledge of BI tools, e.g., Tableau.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Expertise in SQL/PL-SQL: ability to write procedures and create queries for reporting purposes.
- Must have worked on a reporting tool such as Power BI or Tableau.
- Strong knowledge of Excel/Google Sheets: must have worked with pivot tables, aggregate functions, and logical IF conditions.
- Strong verbal and written communication skills for coordination with departments.
- An analytical mind and an inclination for problem-solving.
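The spreadsheet skills listed above have a direct code analogue. The following Python sketch sums revenue by region under an IF-style condition, mirroring a pivot table with an aggregate function; all rows and column names are invented for illustration.

```python
from collections import defaultdict

# Invented sales rows; in a spreadsheet this would be a pivot table with
# region on rows, SUM(revenue) as the value, filtered on order status.
rows = [
    {"region": "North", "status": "shipped", "revenue": 120},
    {"region": "North", "status": "cancelled", "revenue": 80},
    {"region": "South", "status": "shipped", "revenue": 200},
]

pivot = defaultdict(float)
for row in rows:
    if row["status"] == "shipped":              # logical IF condition
        pivot[row["region"]] += row["revenue"]  # aggregate function: SUM
```

The same aggregation can be written as a SQL GROUP BY or built interactively in Excel or Google Sheets.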
Strong knowledge of Power BI (DAX, Power Query, Power BI Service, Power BI Desktop visualisations) and Azure data storage.
Should have experience with Power BI mobile dashboards.
Strong knowledge in SQL.
Good knowledge of DWH concepts.
Work as an independent contributor at the client location.
Implement access control and enforce the required security.
Candidate must have very good communication skills.
- Create, maintain and automate datasets and insightful dashboards to track core metrics and extract business insights
- Analyze large-scale structured and unstructured data to identify business opportunities and optimize features for analytics.
- 8+ years of experience in business intelligence and analytics.
- B.Tech/M.Tech in a technical field (Computer Science, Math, Statistics)
- Strong knowledge in Data Design, Data Modelling, and Data Validation best practices
- Proficient in data visualization, preferably SSRS & Power BI.
- Fluency in writing advanced SQL.
- Experience with SQL Server Administration and best practices
- Possess BFSI, Fintech Domain Knowledge
- Excellent interpersonal, cross-functional, communication, writing, and presentation skills
- Comfortable working in a fast-paced environment with the ability to be a team player
- Possess excellent Project Management and superior Team Management skills