CommerceIQ is Hiring a Data Scientist (3-5 yrs)
At CommerceIQ, we are building the world’s most sophisticated E-commerce Channel Optimization software to help brands leverage Machine Learning, Analytics and Automation to grow their E-commerce business on all channels, globally.
Using CommerceIQ as a single source of truth, customers have driven a 40% increase in incremental sales, a 20% improvement in profitability and a 32% reduction in out-of-stock rates on Amazon.
What You’ll Be Doing
As a Senior Data Scientist, you will work closely with Engineering, Product and Operations teams to build state-of-the-art ML-based solutions for B2B SaaS products. This entails not only leveraging advanced techniques for prediction, time-series forecasting, topic modelling and optimisation, but also a deep understanding of the business and the product.
- Apply excellent problem-solving skills to deconstruct problems and formulate solutions from first principles
- Work on the data science roadmap and build the core engine of our flagship CommerceIQ product
- Collaborate with product and engineering to design product strategy, identify key metrics to drive, and support them with proofs of concept
- Perform rapid prototyping of experimental solutions and develop robust, sustainable and scalable production systems
- Work with large-scale e-commerce data from the biggest brands on Amazon
- Apply out-of-the-box, advanced algorithms to complex problems in real-time systems
- Drive productization of techniques so they can be made available to a wide range of customers
- Work with and mentor fellow team members on your owned charter
What We Are Looking For
- Bachelor’s or Master’s in Computer Science or Maths/Stats from a reputed college, with 4+ years of experience solving data science problems that have driven value for customers
- Good depth and breadth in machine learning (theory and practice), optimization methods, data mining, statistics and linear algebra. Experience in NLP would be an advantage
- Hands-on programming skills and the ability to write modular, scalable code in Python/R. Knowledge of SQL is required
- Familiarity with distributed computing architectures such as Spark, the MapReduce paradigm and Hadoop will be an added advantage
- Strong spoken and written communication skills: able to explain complex ideas in a simple, intuitive manner, and to write and maintain good technical documentation on projects
- Experience building ML data products in an engineering organization, interfacing with other teams and departments to deliver impact
- We are looking for candidates who are curious self-starters and obsess over customer problems to deliver maximum value to them
Job Type: Full-time
- Data Scientist: 3 years (required)
- Looking for product-based industry experience from tier-1/tier-2 colleges (NIT, BIT, IIT, IIIT, BITS, or other strong profiles)
We are looking for candidates who have demonstrated both a strong business sense and a deep understanding of the quantitative foundations of modelling.
• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Experience with statistical programming software such as SPSS, and comfort working with large data sets
• R, Python, SAS and SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills; strong command of both written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines
Qualifications and Experience:
• Graduate degree in Statistics/Economics/Econometrics/Computer Science/Engineering/Mathematics/MBA (with a strong quantitative background), or
• A strong track record of work experience in business intelligence, market research, and/or advanced analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases, and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare a plus
- Role: Machine Learning Lead
- Experience: 5+ Years
- Employee strength: 80+
- Remuneration: Most competitive in the market
• Advanced knowledge of Python.
• Object-oriented programming skills.
• Mathematical understanding of machine learning and deep learning algorithms.
• Thorough grasp of statistical terminology.
• Libraries: TensorFlow, Keras, PyTorch, statsmodels, scikit-learn, SciPy, NumPy, pandas, Matplotlib, Seaborn, Plotly
• Algorithms: ensemble algorithms, artificial neural networks and deep learning, clustering algorithms, decision tree algorithms, dimensionality reduction algorithms, etc.
• MySQL, MongoDB, Elasticsearch or other SQL/NoSQL database implementations.
If interested, kindly share your CV at tanya@tigihr.com
What You’ll Do:
- Accurately translate business needs into conceptual and technical architecture designs for AI models and solutions
- Collaborate with developers and engineering teams to resolve challenging tasks and ensure the proposed design is properly implemented
- Define the strategy for managing changes to AI models (new business needs, technology changes, model retraining, etc.)
- Collaborate with business partners and clients on AI solutioning and use cases; provide recommendations to drive alignment with business teams
- Define and implement evaluation strategies for each model, demonstrate the applicability and performance of the model, and identify its limits
- Design complex system integrations of AI technologies with API-driven platforms, using best practices for security and performance
- Apply experience in languages, tools and technologies such as Python, TensorFlow, PyTorch, Kubernetes, Docker, etc.
- Use MLOps tools (such as TFX, TensorFlow Serving, Kubeflow, etc.) and methodologies for CI/CD of ML models
- Proactively identify and address technical strengths, weaknesses, and opportunities across the AI and ML domain
- Set strategic direction for maximizing simplification and re-use, lowering overall TCO
What You’ll Bring:
- Minimum 10 years of hands-on experience in the IT field, with at least 6 years in Data Science/ML/AI implementation-based products and solutions
- Experience with Computer Vision: Vision AI and Document AI
- Must be hands-on with the Python programming language, MLOps, TensorFlow, PyTorch, Keras, scikit-learn, etc.
- Well versed in deep learning concepts, computer vision, image processing, document processing, convolutional neural networks and data ontology applications
- Proven track record of executing projects in agile, cross-functional teams
- Research papers published at reputable AI conferences, and the ability to lead and drive a research mindset across the team
- Good to have: experience with GCP/Microsoft Azure/Amazon Web Services
- Ph.D. or Master’s in a quantitative field such as Computer Science, IT, or Stats/Maths
What we offer:
- Group Medical Insurance (Family Floater Plan - Self + Spouse + 2 Dependent Children)
- Sum Insured: INR 5,00,000/-
- Maternity cover for up to two children
- Inclusive of COVID-19 Coverage
- Cashless & Reimbursement facility
- Access to free online doctor consultation
- Personal Accident Policy (Disability Insurance)
- Sum Insured: INR 25,00,000/- per employee
- Accidental Death and Permanent Total Disability are covered up to 100% of the Sum Insured
- Permanent Partial Disability is covered as per the scale of benefits decided by the Insurer
- Temporary Total Disability is covered
- An option of a Paytm Food Wallet (up to Rs. 2,500) as a tax-saver benefit
- Monthly internet reimbursement of up to Rs. 1,000
- Opportunity to pursue Executive Programs/ courses at top universities globally
- Professional development opportunities through various MTX-sponsored certifications on multiple technology stacks, including Google Cloud, Amazon and others.
Azure – Data Engineer
- At least 2 years of hands-on experience working with an Agile data engineering team building big data pipelines using Azure in a commercial environment
- Experience dealing with senior stakeholders/leadership
- Understanding of Azure data security and encryption best practices. [ADFS/ACLs]
Databricks – experience writing in and using Databricks, using Python to transform and manipulate data.
Data Factory – experience using Data Factory in an enterprise solution to build data pipelines; experience calling REST APIs.
Synapse/data warehouse – experience using Synapse or a data warehouse to present data securely and to build and manage data models.
Microsoft SQL Server – we’d expect the candidate to have come from a SQL/data background and progressed into Azure.
Power BI – experience with this is preferred.
- Experience using Git as a source control system
- Understanding of DevOps concepts and application
- Understanding of Azure Cloud costs/management and running platforms efficiently
Job Title: Data Engineer
Tech Job Family: DACI
• Bachelor’s Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
• 2 years of experience in Data, BI or Platform Engineering, Data Warehousing/ETL, or Software Engineering
• 1 year of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
• Master’s Degree in Computer Science, CIS, or related field
• 2 years of IT experience developing and implementing business systems within an organization
• 4 years of experience working with defect or incident tracking software
• 4 years of experience with technical documentation in a software development environment
• 2 years of experience working with an IT Infrastructure Library (ITIL) framework
• 2 years of experience leading teams, with or without direct reports
• Experience with application and integration middleware
• Experience with database technologies
• 2 years of experience in Hadoop or any cloud big data components (specific to the Data Engineering role)
• Expertise in Java/Scala/Python, SQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka or equivalent cloud big data components (specific to the Data Engineering role)
• Expertise in MicroStrategy/Power BI/SQL, scripting, Teradata or equivalent RDBMS, Hadoop (OLAP on Hadoop), dashboard development, mobile development (specific to the BI Engineering role)
• 2 years of experience in Hadoop, NoSQL, RDBMS or any cloud big data components, Teradata, MicroStrategy (specific to the Platform Engineering role)
• Expertise in Python, SQL, scripting, Teradata, Hadoop utilities like Sqoop, Hive, Pig, MapReduce, Spark, Ambari, Ranger, Kafka or equivalent cloud big data components (specific to the Platform Engineering role)
Lowe’s is an equal opportunity employer and administers all personnel practices without regard to race, color, religion, sex, age, national origin, disability, sexual orientation, gender identity or expression, marital status, veteran status, genetics or any other category protected under applicable law.
At a firm which works with US clients. Permanent WFH.
This person MUST have:
- B.E. in Computer Science or equivalent
- 5 years of experience with the Django framework
- Experience with building APIs (REST or GraphQL)
- Strong troubleshooting and debugging skills
- React.js knowledge would be an added bonus
- Understanding of how to use a database like Postgres (preferred choice), SQLite, MongoDB or MySQL.
- Sound knowledge of object-oriented design and analysis.
- A strong passion for writing simple, clean and efficient code.
- Proficient understanding of code versioning tools such as Git.
- Strong communication skills.
- Minimum 5 years of experience
- Startup experience is a must.
- Remote developer
- 40 hours a week, with 4 hours a day overlapping with the client’s timezone. Typically clients are in the California (PST) timezone.
- Full time/Direct
- We have great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO leaves per year, annual increments, a Diwali bonus, spot bonuses and other incentives.
- We don’t believe in locking in people with long notice periods. You will stay here because you love the company. We have only a 15-day notice period.
- Design, implement and support an analytical data infrastructure, providing ad hoc access to large data sets and computing power.
- Contribute to the development of standards and the design and implementation of proactive processes to collect and report data and statistics on assigned systems.
- Research opportunities for data acquisition and new uses for existing data.
- Provide technical development expertise for designing, coding, testing, debugging, documenting and supporting data solutions.
- Experience building data pipelines to connect analytics stacks, client data visualization tools and external data sources.
- Experience with cloud and distributed systems principles
- Experience with Azure/AWS/GCP cloud infrastructure
- Experience with Databricks Clusters and Configuration
- Experience with Python, R, sh/bash and JVM-based languages, including Scala and Java.
- Experience with Hadoop-family languages, including Pig and Hive.
Data Analyst Job Duties
Data analyst responsibilities include conducting full-lifecycle analysis covering requirements, activities and design. Data analysts will develop analysis and reporting capabilities, and will also monitor performance and quality-control plans to identify improvements.
Interpret data, analyze results using statistical techniques and provide ongoing reports
Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
Acquire data from primary or secondary data sources and maintain databases/data systems
Identify, analyze, and interpret trends or patterns in complex data sets
Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
Work with management to prioritize business and information needs
Locate and define new process improvement opportunities
Proven working experience as a Data Analyst or Business Data Analyst
Technical expertise regarding data models, database design development, data mining and segmentation techniques
Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.)
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Adept at queries, report writing and presenting findings
BS in Mathematics, Economics, Computer Science, Information Management or Statistics