- BE Computer Science, MCA or equivalent
- Experience with JavaScript AI libraries (there are many!)
- A team player who can collaborate with engineers, designers, and other cross-functional teams
- Ability to initiate and drive projects to completion with minimal guidance
- Fluent in and passionate about JavaScript
- Troubleshooting/debugging experience
- Strong communication skills
Experience:
- Minimum 1 year of experience
- Not more than 7 years of experience
- Startup experience is a must.
Location: Remote, anywhere in India
Timings: 40 hours a week, with 4 hours a day overlapping with the client's timezone. Clients are typically in California (PST).
Position: Full-time / Direct
Other Benefits
- We offer great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO days per year, annual increments, a Diwali bonus, spot bonuses, and other incentives.
- We don't believe in locking people in with long notice periods. You will stay here because you love the company. Our notice period is only 15 days.
About: A firm that works with US clients. Permanent WFH.
Data-driven decision-making is core to advertising technology at AdElement. We are looking for sharp, disciplined, and highly quantitative machine learning / artificial intelligence engineers with big data experience and a passion for digital marketing to help drive informed decision-making. You will work with top talent and cutting-edge technology, and have a unique opportunity to turn your insights into products influencing billions. The ideal candidate will have an extensive background in distributed training frameworks, experience deploying machine learning models end to end, and some experience with data-driven decision-making for machine learning infrastructure enhancement. This is your chance to leave your legacy and be part of a highly successful and growing company.
Required Skills
- 3+ years of industry experience with Java/ Python in a programming intensive role
- 3+ years of experience with one or more of the following machine learning topics: classification, clustering, optimization, recommendation system, graph mining, deep learning
- 3+ years of industry experience with distributed computing frameworks such as Hadoop/Spark, Kubernetes ecosystem, etc
- 3+ years of industry experience with popular machine learning/deep learning frameworks such as Spark MLlib, Keras, TensorFlow, PyTorch, etc.
- 3+ years of industry experience with major cloud computing services
- An effective communicator with the ability to explain technical concepts to a non-technical audience
- (Preferred) Prior experience with ads product development (e.g., DSP/ad-exchange/SSP)
- Able to lead a small team of AI/ML Engineers to achieve business objectives
Responsibilities
- Collaborate across multiple teams - Data Science, Operations & Engineering on unique machine learning system challenges at scale
- Leverage distributed training systems to build scalable machine learning pipelines, including ETL, model training, and deployments, in the Real-Time Bidding space.
- Design and implement solutions to optimize distributed training execution in terms of model hyperparameter optimization, model training/inference latency and system-level bottlenecks
- Research state-of-the-art machine learning infrastructure to improve data health, model quality, and state management during the lifecycle of ML model refreshes.
- Optimize integration between popular machine learning libraries and cloud ML and data processing frameworks.
- Build Deep Learning models and algorithms with optimal parallelism and performance on CPUs/ GPUs.
- Work with top management on defining team goals and objectives.
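One of the responsibilities above mentions model hyperparameter optimization. As a minimal illustrative sketch only (the objective function and search space below are hypothetical, not something prescribed by this role), a simple random search over hyperparameters might look like:

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Minimal random-search sketch for hyperparameter optimization.

    `objective` maps a config dict to a score (lower is better);
    `space` maps each hyperparameter name to a list of candidate values.
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        # Sample one candidate value per hyperparameter, score it,
        # and keep the best configuration seen so far.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Hypothetical objective: a smooth bowl over learning rate and batch size,
# standing in for a real validation-loss measurement.
space = {"lr": [0.001, 0.01, 0.1, 1.0], "batch": [16, 32, 64, 128]}
objective = lambda c: (c["lr"] - 0.01) ** 2 + (c["batch"] - 64) ** 2 / 1e4
best_cfg, best_score = random_search(objective, space)
```

In production, the same loop would typically be distributed across workers and the objective would be a full train-and-evaluate run rather than a closed-form function.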
Education
- MTech or Ph.D. in Computer Science, Software Engineering, Mathematics or related fields
The Platform Data Science team works at the intersection of data science and engineering. Domain experts develop and advance platforms, including the data platform, the machine learning platform, and other platforms for Forecasting, Experimentation, Anomaly Detection, Conversational AI, Underwriting of Risk, Portfolio Management, Fraud Detection & Prevention, and many more. We are also the Data Science and Analytics partners for Product, and provide Behavioural Science insights across Jupiter.
About the role:
We’re looking for strong Software Engineers who can combine EMR, Redshift, Hadoop, Spark, Kafka, Elasticsearch, TensorFlow, PyTorch, and other technologies to build the next-generation Data Platform, ML Platform, and Experimentation Platform. If this sounds interesting, we’d love to hear from you!
This role will involve designing and developing software products that impact many areas of our business. The individual in this role will help define requirements, create software designs, implement code to those specifications, provide thorough unit and integration testing, and support products while they are deployed and used by our stakeholders.
Key Responsibilities:
Participate in, own, and influence the architecture and design of systems
Collaborate with other engineers, data scientists, product managers
Build intelligent systems that drive decisions
Build systems that enable us to perform experiments and iterate quickly
Build platforms that enable scientists to train, deploy and monitor models at scale
Build analytical systems that drive better decision-making
Required Skills:
Programming experience with at least one modern language such as Java or Scala, including object-oriented design
Experience in contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems
Bachelor’s degree in Computer Science or related field
Computer Science fundamentals in object-oriented design
Computer Science fundamentals in data structures
Computer Science fundamentals in algorithm design, problem solving, and complexity analysis
Experience with databases, analytics, big data systems, or business intelligence products:
Data lake, data warehouse, ETL, ML platform
Big data technologies such as Hadoop and Apache Spark
About Thriving Springs:
Have you ever wondered what it takes to succeed at the workplace? Is it only technical skills, or do behavioral skills also play a role?
Research suggests that 85% of job success depends on soft and behavioral skills. This is where Thriving Springs plays a critical role: it helps organizations, teams, and individuals build these skills, unlock their highest potential, and achieve all-round success. It does this by assessing current levels of soft and behavioral skills, then growing them through Thriving Springs's innovative smart platform, which combines the power of emotional intelligence (EI) with artificial intelligence (AI) and machine learning (ML).
Why join Thriving Springs?
There are many companies, but companies built with a clear purpose and a mission to take humanity forward are rare. Thriving Springs, an EdTech startup, is one of the pioneers in driving human productivity, fulfillment, and success forward by empowering millions of working professionals globally with soft and behavioral skills and assisting them in the flow of work. Thriving Springs is leading a worldwide, purpose-led movement to drive success at the workplace, and we invite you to join this fun and adventurous journey with us.
Culture, Rewards and Benefits:
"A great culture is created when everybody in the company has great opportunities, creates meaningful impact and contributes to the good of society." - Larry Page, Co-founder of Google
The founders of Thriving Springs come from Google and have built a Google-like culture within Thriving Springs where every team member experiences freedom and autonomy to pursue their ideas that have a deep impact and will shape the future of workplace learning. The culture at Thriving Springs is deeply rooted in emotional intelligence where there is an emphasis on collaboration, empathy and winning together as one team.
The candidate will receive an attractive total compensation package, including a competitive fixed salary, performance bonuses, and ESOPs translating into non-incremental gains as the company grows into a potential unicorn. In addition, every member will receive medical and health insurance.
Location: Hyderabad, India
Qualifications:
- BTech/BE in Computer Science, Electrical, Electronics, or related fields
- 5+ years of full-stack design and development experience
- High emotional intelligence, empathy, and a collaborative approach
- Experience with the Angular JavaScript framework, CSS, HTML5, NodeJS, ExpressJS, and MongoDB for full-stack web development
- Experience developing rich, dynamic front-end applications using Angular and CSS frameworks like BulmaCSS, Angular Material, Bootstrap, etc.
- Knowledge of GraphQL would be an added advantage
- Knowledge of cloud services like AWS, Heroku, and Azure is preferable
- Should be a quick learner who keeps up with the pace of the ever-changing world of technology, as the candidate will get excellent exposure to the latest and trending cloud-based SaaS technologies and best practices while working with varied customers based across the globe
Responsibilities:
- Develop web applications covering the end-to-end software development life cycle, from writing UI code in Angular to backend API code in NodeJS, and managing databases like MongoDB, MySQL, etc.
- Manage full-stack code from Git check-ins to running automated builds and deployments, using DevOps practices to deploy to public cloud services like AWS, Azure, Heroku, etc.
- Handle the full-stack web development workflow, from front end to backend to CI/CD
- Design and develop the tech architecture, working closely with the CEO and CTO of the company
- Drive and guide the work of other engineers on the team
This is a leadership role, and the candidate is expected to wear multiple technical hats, including customer interactions and investor discussions.
- Perform research and development on Machine Learning, specifically in the areas of Speech Recognition, Digital Signal Processing, Audio Signal Processing, Natural Language Processing, and Natural Language Understanding
- Read and keep up with the research in Speech Recognition, Machine Learning, and Deep Learning
- Understand and implement research papers, apply them to the business problem, and build the solution
- Contribute to applied research and open source community
- Mentor and guide team members
You will build Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video, and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), and Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, and team leads and project managers to plan, build and roll
out data driven solutions
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise of data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g., XML/JSON) and flat schemas (e.g., CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API, SFTP, etc.)
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/functional scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g., HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform and its components, and the ability to leverage its resources to implement solutions, is a must.
Experience working in the public sector, or in an organisation servicing the public sector, is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes
Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect
Excellent interpersonal skills, working with teams and building trust with clients
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team and to work comfortably without direct supervision.
We have an excellent job opportunity for an "Applied Machine Learning Engineer" with a product-based organization, available in remote working mode or at the Mumbai location.
Job Responsibilities:
- Apply your knowledge of ML and statistics to conceptualise, experiment with, develop, and deploy machine learning and deep learning systems.
- Understand the business objectives and define the right target metrics to track performance and progress.
- Define and build datasets with the appropriate representation techniques for learning.
- Train and tune models; run evaluation and test experiments on the models.
- Build ML pipelines end to end (everything MLOps).
- Build pipelines for the various stages.
- Deploy models.
- Troubleshoot issues with models in production.
- Report results of model performance in production.
- Retrain models; handle performance logging and maintenance.
- Help the business with insights for better decision-making. You will build many predictive models for internal business operations and derive insights from the trained models and data to help the product and business teams make better decisions.
Requirements:
- 2+ years of work experience as an ML Engineer or Data Scientist, with a Bachelor's degree in Computer Science or a related field
- Theoretical and practical knowledge of Machine Learning, Deep Learning, and statistical methods (NLP tasks, recommender systems, predictive modelling, etc.)
- Since Pepper is a content company, you will work on many interesting text based problems. Solid understanding of Natural Language Processing techniques with Deep Learning is a must for this role.
- Familiarity with popular NLP applications (text classification, machine translation, named entity recognition, summarisation, question answering, zero-shot learning, etc.) and text representation architectures and techniques (Bag of Words, TF-IDF, Word2vec, GloVe, BERT, ELMo, GPT, etc.)
- Experience with ML frameworks (like TensorFlow, Keras, PyTorch) and libraries like scikit-learn.
- Experience with ML infrastructure & shipping models.
- Excellent programming and algorithmic skills. Good understanding of data structures and algorithms (fluent in at least one object-oriented programming language). Proficiency in Python is a must.
- Strong understanding of database systems and schema design. Proficiency in SQL is a must.
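The requirements above name TF-IDF among the text-representation techniques a candidate should know. As a minimal from-scratch sketch (the smoothed IDF formula below is one common variant, used here only for illustration):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    Returns one {term: weight} dict per document. Uses a smoothed
    IDF, log((1 + N) / (1 + df)), where N is the number of documents
    and df is the number of documents containing the term.
    """
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({
            term: (count / total) * math.log((1 + n) / (1 + df[term]))
            for term, count in tf.items()
        })
    return weights

docs = [
    "the cat sat on the mat".split(),
    "the dog chased the cat".split(),
]
w = tf_idf(docs)
# "the" appears in every document, so its smoothed IDF is log(3/3) = 0
# and its weight vanishes; "mat" is unique to the first document, so it
# keeps a positive weight there.
```

In practice a library implementation (e.g., scikit-learn's TfidfVectorizer) handles tokenization, normalisation, and sparse storage, but the core weighting idea is the one above.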
Please let us know if you are interested in the above opening, and if so, please share your:
Current CTC :
Expected CTC :
Notice Period :
Relevant experience in Machine Learning :
Relevant experience in Deep Learning:
Relevant experience in NLP Applications:
Regards
Ashwini
Leena AI serves its customers with a one-of-a-kind enterprise HR chatbot experience. Leena AI has raised over $35 Mn from top Silicon Valley investors like Bessemer, Greycroft, YCombinator, and Elad Gil. Leena is used by over 2,000,000 employees globally in companies like Coca-Cola, Sony, Tata Technologies, Marico, Vodafone, and Cipla, and is growing exponentially fast.
Job Description:
1. Build and design conversational chatbots that can be deployed directly for use by a large volume of customers.
2. Build training data for our machine learning engine to understand user queries.
3. Build FAQs from company policies.
4. Once the system is live, debug the negative cases and correct them in the training data.
5. Manage the end-to-end lifecycle of the data in the system until it achieves more than 90% accuracy.
6. Build a master dataset of all the queries that come into the system.
Requirements :
1. Adequate verbal communication skills in English; the ability to articulate solutions in simple, clear language.
2. Good analytical skills
3. Experience in handling customer queries/worked in chat support
4. Logical thinking to build control flows.
5. Hands on experience in Microsoft Excel, Microsoft Word
Good to have:
1. Knowledge of / experience in building chatbots using publicly available bot platforms like Dialogflow, Microsoft Bot Framework, etc.
This position is not for freshers. We are looking for candidates with at least 4 years of AI/ML/CV experience in the industry.