We are looking for a Data Analyst who will oversee organisational data analytics. You will design and help implement the data analytics platform that keeps the organisation running. The team will be the go-to for all data needs for the app, and we are looking for a self-starter who is hands-on yet able to abstract problems and anticipate data requirements.
This person should be a very strong technical data analyst who can design and implement data systems independently. They also need to be proficient in business reporting and should have a keen interest in providing the data the business needs.
Tools familiarity: SQL, Python, Mixpanel, Metabase, Google Analytics, CleverTap, App Analytics
- Processes and frameworks for metrics, analytics, experimentation and user insights; leadership of the data analytics team
- Metrics alignment across teams to make them actionable and promote accountability
- Data-driven frameworks for assessing and strengthening product-market fit
- Identifying viable growth strategies through data and experimentation
- Experimentation for product optimisation and understanding user behaviour
- A structured approach to deriving user insights and answering questions with data
- Work closely with Technical and Business teams to get this implemented.
- 4 to 6 years in a relevant data analytics role at a product-oriented company
- Highly organised, technically sound & a good communicator
- Ability to handle & build for cross-functional data requirements and interactions with teams
- Great with Python and SQL
- Able to build and mentor a team
- Knowledge of key business metrics such as cohorts, engagement cohorts, LTV, ROAS, ROE
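The cohort metrics listed above can be illustrated with a minimal pure-Python sketch. The event data and the `cohort_retention` function are hypothetical, invented for illustration only:

```python
# Hypothetical event data: (user_id, signup_month, active_month).
events = [
    ("u1", "2024-01", "2024-01"), ("u1", "2024-01", "2024-02"),
    ("u2", "2024-01", "2024-01"),
    ("u3", "2024-02", "2024-02"), ("u3", "2024-02", "2024-03"),
]

def cohort_retention(events):
    """Return {signup_month: {active_month: fraction of that cohort active}}."""
    cohorts = {}   # signup_month -> set of users in the cohort
    activity = {}  # (signup_month, active_month) -> set of active users
    for user, signup, active in events:
        cohorts.setdefault(signup, set()).add(user)
        activity.setdefault((signup, active), set()).add(user)
    return {
        signup: {
            active: len(users) / len(cohorts[signup])
            for (s, active), users in activity.items() if s == signup
        }
        for signup in cohorts
    }

retention = cohort_retention(events)
print(retention["2024-01"]["2024-02"])  # 0.5: one of two Jan signups active in Feb
```

The same shape of computation underlies engagement cohorts and LTV: group users by acquisition period, then aggregate a per-period measure over each group.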
BTech or MTech in Computer Science/Engineering from a Tier-1 or Tier-2 college
Good knowledge of data analytics and data visualization tools. A formal certification would be an added advantage.
We are more interested in what you CAN DO than your location, education, or experience levels.
Send us your code samples / GitHub profile / published articles if applicable.
1. Working on supervised and unsupervised learning algorithms
2. Developing deep learning and machine learning algorithms
3. Working on live projects on data analytics
● Knowledge of Excel, SQL and writing code in Python.
● Experience with Reporting and Business Intelligence tools like Tableau, Metabase.
● Exposure to distributed analytics processing technologies is desired (e.g. Hive, Spark).
● Experience with Clevertap, Mixpanel, Amplitude, etc.
● Excellent communication skills.
● Background in market research and project management.
● Attention to detail.
● Problem-solving aptitude.
● Working on an awesome AI product for the eCommerce domain.
● Build the next-generation information extraction and computer vision product powered by state-of-the-art AI and Deep Learning techniques.
● Work with an international top-notch engineering team fully committed to Machine Learning development.
Desired Candidate Profile
● Passionate about search & AI technologies; open to collaborating with colleagues.
● Good understanding of the mainstream deep learning models from multiple domains: computer vision, NLP, reinforcement learning, model optimization, etc.
● Hands-on experience with deep learning frameworks, e.g. TensorFlow, PyTorch, MXNet. Able to implement the latest DL models (such as BERT) using existing APIs and open-source libraries in a short time.
● Hands-on experience with cloud-native techniques. Good understanding of web services and modern software technologies.
● Maintained/contributed to machine learning projects; familiar with the agile software development process, CI/CD workflow, ticket management, code review, and version control.
● Skilled in the following programming languages: Python 3.
● Good English skills, especially for writing and reading documentation.
- Architect and design our customers' data-driven applications and solutions, and own the back-end technology
- Develop architectures that are inherently secure, robust, scalable, modular, and API-centric
- Build distributed backend systems serving real-time analytics and machine learning features at scale
- Own the scalability and performance metrics of complex distributed systems.
- Apply architecture best practices that help increase execution velocity
- Collaborate with the key stakeholders, like business, product, and other technology teams
- Mentor junior members in the team
- Excellent academic background (MS/B.Tech from a top-tier university)
- 6-10 years of experience in backend architecture and development with large data volumes
- Extensive hands-on experience in the Big Data ecosystem (like Hadoop, Spark, Presto, Hive), databases (like MySQL, PostgreSQL), NoSQL (like MongoDB, Cassandra), and data warehousing (like Redshift)
- Experience in cloud-based technology solutions with scale and robustness
- Strong data management and migration experience including proficiency in data warehousing, data quality, and analysis.
- Experience in the development of microservices/REST APIs
- Experience with Agile and DevOps development methodology and tools like Jira, Confluence
- Understanding/exposure to complete product development cycle
- Strong Python coding and OOP skills
- Should have worked on Big Data product architecture
- Should have worked with at least one SQL database (e.g. MySQL, PostgreSQL) and at least one NoSQL database (e.g. Cassandra, Elasticsearch)
- Hands-on experience with Spark abstractions: RDD, DataFrame, Dataset
- Experience developing ETL for data products
- Working knowledge of performance optimization, optimal resource utilization, parallelism, and tuning of Spark jobs
- Working knowledge of file formats: CSV, JSON, XML, Parquet, ORC, Avro
- Good to have: working knowledge of an analytical database such as Druid, MongoDB, or Apache Hive
- Experience handling real-time data feeds (working knowledge of Apache Kafka or a similar tool is a plus)
- Python and Scala (optional), Spark/PySpark, parallel programming
Data Scientist - Applied AI
Who are we?
Searce is a niche Cloud Consulting business with a futuristic tech DNA. We use new-age tech to realise the “Next” in the “Now” for our clients. We specialise in Cloud Data Engineering, AI/Machine Learning, and advanced cloud infrastructure tech such as Anthos and Kubernetes. We are one of the top and fastest-growing partners for Google Cloud and AWS globally, with over 2,500 clients successfully moved to the cloud.
What do we believe?
- Best practices are overrated
- Implementing best practices can only make one average.
- Honesty and Transparency
- We believe in the naked truth. We do what we say and say what we do.
- Client Partnership
- Client-vendor relationship? No. We partner with clients instead.
- And our sales team comprises 100% of our clients.
How do we work?
It’s all about being Happier first, and the rest follows. Searce work culture is defined by HAPPIER.
- Humble: Happy people don’t carry ego around. We listen to understand; not to respond.
- Adaptable: We are comfortable with uncertainty. And we accept changes well. As that’s what life's about.
- Positive: We are super positive about work & life in general. We love to forget and forgive. We don’t hold grudges. We don’t have time or adequate space for it.
- Passionate: We are as passionate about the great street-food vendor across the street as about Tesla’s new model and so on. Passion is what drives us to work and makes us deliver the quality we deliver.
- Innovative: Innovate or Die. We love to challenge the status quo.
- Experimental: We encourage curiosity & making mistakes.
- Responsible: Driven. Self motivated. Self governing teams. We own it.
So, what are we hunting for?
As a Data Scientist, you will help develop and enhance the algorithms and technology that power our unique system. This role covers a wide range of challenges, from developing new models to using pre-existing components to make current systems more intelligent. You should be able to train models on existing data and use them in the most creative manner to deliver the smartest experience to customers. You will develop multiple AI applications that push the threshold of intelligence in machines.
Working on multiple projects at a time, you will have to maintain a consistently high level of attention to detail while finding creative ways to provide analytical insights. You will also have to thrive in a fast, high-energy environment and be able to balance multiple projects in real time. The thrill of the next big challenge should drive you, and when faced with an obstacle you should be able to find clever solutions. You must have the ability and interest to work on a range of different types of projects and business processes, and a background that demonstrates this ability.
Your bucket of Undertakings:
- Collaborate with team members to develop new models to be used for classification problems
- Work on software profiling, performance tuning and analysis, and other general software engineering tasks
- Use independent judgment to take existing data and build new models from it
- Collaborate, provide technical guidance, and come up with new ideas, rapid prototyping, and converting prototypes into scalable products
- Conduct experiments to assess the accuracy and recall of language processing modules and to study the effect of such experiments
- Lead AI R&D initiatives to include prototypes and minimum viable products
- Work closely with multiple teams on projects like Visual quality inspection, ML Ops, Conversational banking, Demand forecasting, Anomaly detection etc.
- Build reusable and scalable solutions for use across the customer base
- Prototype and demonstrate AI related products and solutions for customers
- Assist business development teams in the expansion and enhancement of a pipeline to support short- and long-range growth plans
- Identify new business opportunities and prioritize pursuits for AI
- Participate in long range strategic planning activities designed to meet the Company’s objectives and to increase its enterprise value and revenue goals
Education & Experience:
- BE/B.Tech/Masters in a quantitative field such as CS, EE, Information Sciences, Statistics, Mathematics, Economics, Operations Research, or a related field, with a focus on applied and foundational Machine Learning, AI, NLP, and/or data-driven statistical analysis & modelling
- 3+ years of experience applying AI/ML/NLP/deep learning/data-driven statistical analysis & modelling solutions across multiple domains; experience with financial engineering and financial processes a plus
- Strong, proven programming skills with machine learning, deep learning and Big Data frameworks including TensorFlow, Caffe, Spark, and Hadoop; experience writing complex programs and implementing custom algorithms in these and other environments
- Experience beyond using open source tools as-is, and writing custom code on top of, or in addition to, existing open source frameworks
- Proven capability in demonstrating successful advanced technology solutions (either prototypes, POCs, well-cited research publications, and/or products) using ML/AI/NLP/data science in one or more domains
- Ability to research and implement novel machine learning and statistical approaches
- Experience in data management, data analytics middleware, platforms and infrastructure, cloud and fog computing is a plus
- Excellent communication skills (oral and written) to explain complex algorithms, solutions to stakeholders across multiple disciplines, and ability to work in a diverse team
- Extensive experience with Hadoop and Machine learning algorithms
- Exposure to Deep Learning, Neural Networks, or related fields and a strong interest and desire to pursue them
- Experience in Natural Language Processing, Computer Vision, Machine Learning or Machine Intelligence (Artificial Intelligence)
- Passion for solving NLP problems
- Experience with specialized tools and projects for natural language processing
- Knowledge of machine learning frameworks like TensorFlow and PyTorch
- Experience with version control systems like Git/GitHub
- Fast learner, able to work independently as well as in a team environment, with good written and verbal communication skills
We are a young, fast-growing AI company shaking up how work gets done across the enterprise. Every day, we help clients identify opportunities for automation, and then use a variety of AI and advanced automation techniques to rapidly model manual work in the form of code. Our impact has already been felt across some of the most reputable Fortune 500 companies, who are consequently seeing major gains in efficiency, client satisfaction, and overall savings. It’s an exciting experience to watch companies transform themselves rapidly with Soroco!
Based across the US, UK, and India, our team includes several PhDs and graduates from top-notch universities such as MIT, Harvard, Carnegie Mellon, Dartmouth, and top rankers/medalists from the IITs and NITs. The senior leadership includes a former founder of a VC/hedge fund, a computer scientist from Harvard, and a former founder of a successful digital media firm. Our team has collectively published more than 100 papers in international journals and conferences and has been granted over 20 patents. Our board members include some of the most well-known entrepreneurs across the globe, and our early clients include some of the most innovative Fortune 100 companies.
In this individual contributor role, the Business Analyst (BA) will work closely with the Data Science Manager in India. BAs will be primarily responsible for analyzing improvement opportunities in business processes, people productivity, and application usage experience, along with other advanced analytics projects, using data collected by the Soroco scout platform, for clients from diverse industries.
Responsibilities include (but are not limited to):
- Understand project objectives and frame an analytics approach to provide the solution.
- Take ownership of extracting, cleansing, structuring & analyzing data.
- Analyze data using statistical or rule-based techniques to identify actionable insights.
- Prepare PowerPoint presentations and build visualization solutions for presenting the analysis & actionable insights to clients.
- Brainstorm and perform root cause analysis to provide suggestions to improve the scout platform.
- Work closely with product managers to build analytical features in the product.
- Manage multiple projects simultaneously, in a fast-paced setting
- Communicate effectively with client engagement, product, and engineering teams
An ideal BA should be passionate and entrepreneurial in nature, with a flexible attitude to learn anything and a willingness to provide the highest level of professional service.
- 2-4 years of analytics work experience with a university degree in Engineering, preferably from a Tier-1 or Tier-2 college.
- Possess the skill to creatively solve analytical problems and propose solutions.
- Ability to perform data manipulation and data modeling with complex data using SQL/Python
- Knowledge of statistics and experience using statistical packages for analyzing datasets (R/Python)
- Proficiency in Microsoft Office Excel and PowerPoint.
- Impeccable attention to detail with excellent prioritization skills
- Effective verbal, written and interpersonal communication skills.
- Must be a team player and able to build strong working relationships with stakeholders
- Strong capabilities and experience with programming in Python (Numpy & Pandas)
- Knowledge of machine learning techniques (clustering, classification, and sequencing, among others)
- Experience with visualization tools like Tableau, PowerBI, Qlik.
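The clustering techniques named in the list above can be sketched with a toy example. The following is a minimal 1-D k-means in pure Python; the data and function name are invented for illustration, not part of any Soroco tooling:

```python
import random

def kmeans_1d(points, k=2, iters=20, seed=0):
    """Tiny 1-D k-means sketch: returns sorted cluster centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from k distinct points
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = {i: [] for i in range(k)}
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        centroids = [
            sum(c) / len(c) if c else centroids[i]
            for i, c in clusters.items()
        ]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 10.0, 10.5, 9.5]
print(kmeans_1d(data))  # two centroids, near 1.0 and 10.0
```

In practice this would be `sklearn.cluster.KMeans` on NumPy arrays; the sketch only shows the assign/update loop the requirement refers to.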
How You Will Grow:
Soroco believes in supporting you and your career. We will encourage you to grow by providing you with professional development opportunities across multiple business functions. Joining a young company will allow you to explore what is possible and have a high impact.
- Installing and configuring Informatica components, including high availability; managing server activations and de-activations for all environments; ensuring that all systems and procedures adhere to organizational best practices
- Day-to-day administration of the Informatica Suite of services (PowerCenter, IDS, Metadata, Glossary and Analyst).
- Informatica capacity planning and on-going monitoring (e.g. CPU, Memory, etc.) to proactively increase capacity as needed.
- Manage backup and security of Data Integration Infrastructure.
- Design, develop, and maintain all data warehouse, data marts, and ETL functions for the organization as a part of an infrastructure team.
- Consult with users, management, vendors, and technicians to assess computing needs and system requirements.
- Develop and interpret organizational goals, policies, and procedures.
- Evaluate the organization's technology use and needs and recommend improvements, such as software upgrades.
- Prepare and review operational reports or project progress reports.
- Assist in the daily operations of the Architecture Team, analyzing workflow, establishing priorities, developing standards, and setting deadlines.
- Work with vendors to manage support SLAs and influence the vendor product roadmap
- Provide leadership and guidance in technical meetings, define standards and assist/provide status updates
- Work with cross functional operations teams such as systems, storage and network to design technology stacks.
- Minimum of 6 years' experience in an Informatica Engineer/Developer role
- Minimum of 5 years' experience in an ETL environment as a developer
- Minimum of 5 years of experience in SQL coding and understanding of databases
- Proficiency in Python
- Proficiency in command line troubleshooting
- Proficiency in writing code in Perl/Shell scripting languages
- Understanding of Java and concepts of Object-oriented programming
- Good understanding of systems, networking, and storage
- Strong knowledge of scalability and high availability
Indium Software is a niche technology solutions company with deep expertise in Digital, QA and Gaming. Indium helps customers in their Digital Transformation journey through a gamut of solutions that enhance business value.
With over 1,000 associates globally, Indium operates through offices in the US, UK and India.
Visit www.indiumsoftware.com to know more.
Job Title: Analytics Data Engineer
What will you do:
The Data Engineer must be an expert in SQL development, providing support to the Data and Analytics team in database design, data flow, and analysis activities. The Data Engineer also plays a key role in the development and deployment of innovative big data platforms for advanced analytics and data processing, and defines and builds the data pipelines that enable faster, better, data-informed decision-making within the business.
Extensive experience with SQL and a strong ability to process and analyse complex data.
The candidate should also be able to design, build, and maintain the business’s ETL pipeline and data warehouse, and should demonstrate expertise in data modelling and query performance tuning on SQL Server.
Proficiency in analytics, especially funnel analysis, with hands-on experience in analytical tools like Mixpanel, Amplitude, ThoughtSpot, Google Analytics, and similar tools.
Should work on the tools and frameworks required for building efficient and scalable data pipelines.
Excellent at communicating and articulating ideas, with an ability to influence others and continuously drive towards a better solution.
Experience working in Python, Hive queries, Spark, PySpark, Spark SQL, and Presto.
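As an illustrative sketch of the funnel analysis mentioned above, the following uses Python's built-in sqlite3 in place of a real warehouse; the table, step names, and data are hypothetical:

```python
import sqlite3

# Hypothetical event log: one row per funnel step a user completed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id TEXT, step TEXT)")
con.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("u1", "visit"), ("u1", "signup"), ("u1", "purchase"),
     ("u2", "visit"), ("u2", "signup"),
     ("u3", "visit")],
)

# Count distinct users reaching each funnel step, ordered by funnel depth.
rows = con.execute("""
    SELECT step, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY step
    ORDER BY CASE step
        WHEN 'visit' THEN 1 WHEN 'signup' THEN 2 WHEN 'purchase' THEN 3
    END
""").fetchall()
print(rows)  # [('visit', 3), ('signup', 2), ('purchase', 1)]
```

Tools like Mixpanel or Amplitude compute the same distinct-user counts per step; the drop-off between consecutive rows is the funnel's conversion loss.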
- Relate Metrics to product
- Programmatic Thinking
- Edge cases
- Good Communication
- Product functionality understanding
Perks & Benefits:
A dynamic, creative & intelligent team that will make you love being at work.
An autonomous and hands-on role to make an impact; you will be joining at an exciting time of growth!
Flexible work hours and an attractive pay package and perks
An inclusive work environment that lets you work in the way that works best for you!
2. Should understand the importance and know-how of taking a machine-learning-based solution to the consumer.
3. Hands-on experience with statistical and machine-learning tools and techniques.
4. Good exposure to deep learning libraries like TensorFlow and PyTorch.
5. Experience in implementing Deep Learning techniques, Computer Vision and NLP. The candidate should be able to develop the solution from scratch, with public GitHub code to demonstrate it.
6. Should be able to read research papers and pick up ideas to quickly reproduce the research in the most comfortable Deep Learning library.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production.
8. Expert-level coding experience in Python.
9. Technologies: Backend - Python (Programming Language)
10. Should have the ability to think about long-term solutions, modularity, and reusability of components.
11. Should be able to work in a collaborative way. Should be open to learning from peers as well as constantly bringing new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.
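Item 7 above asks for code complexity analysis and optimization; a minimal before/after sketch (the function names and data are illustrative):

```python
def has_duplicate_quadratic(items):
    """O(n^2): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n) average case: a set gives O(1) membership checks."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

print(has_duplicate_linear([3, 1, 4, 1, 5]))  # True: 1 appears twice
```

Spotting that the quadratic version can be replaced by the set-based one, and articulating the trade-off (extra memory for the set versus quadratic time), is the kind of analysis this requirement refers to.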