Senior Backend Engineer
at a secure data and intelligence sharing platform for enterprises. We believe data security and privacy are paramount for AI and machine learning to truly evolve and embed into the world.
Good experience writing quality, mature Python code. Familiar with Python
design patterns: OOP, refactoring patterns, and writing async and heavy tasks.
Understand authentication/authorization (authn/authz); ideally you have worked on
authentication/authorization mechanisms in Python. Familiarity with Auth0 is preferred.
Understand how to secure API endpoints.
Familiar with AWS concepts: EC2, VPC, RDS, and IAM (or the equivalents on any cloud provider).
Backend Engineer @Eder Labs
Have basic DevOps experience, engineering and supporting services in a modern
containerized cloud stack.
Experience and understanding of Docker and docker-compose.
Own backend design, architecture, implementation, and delivery of features.
Take ownership of the database: write migrations, and maintain and manage the
database (Postgres, MongoDB).
Collaborate with a generalist team to develop, test, and launch new features. Be a
generalist and find ways to lift up your team, the product, and
eventually the business.
Refactor when needed, and keep hunting for new tools that can help us as a
business (not just the engineering team).
Develop data pipelines, from data sourcing and wrangling (cleaning) through
transformations to eventual use.
Develop MLOps systems that take in data, analyze it, pass it through models,
and process the results: DevOps for machine learning.
Follow modern Git-oriented dev workflows, versioning, and CI/CD automation.
The ideal candidate will have:
2 years of full-time experience working as a data infrastructure / core backend
engineer in a team environment.
Understanding of Machine Learning technologies, frameworks and paradigms
Experience with the following tools:
FastAPI / Django
Kafka / RabbitMQ
TensorFlow / Pandas / Jupyter Notebook
pytest / asyncio
Experience setting up and managing ELK stack
In-depth understanding of database systems, in terms of scaling compute efficiently.
Good understanding of data streaming services and the networking involved.
Power BI Developer (Azure Developer)
Senior visualization engineer with an understanding of Azure Data Factory & Databricks, able to develop and deliver solutions that enable delivery of information to audiences in support of key business processes.
Ensure code and design quality through execution of test plans, and assist in the development of standards & guidelines, working closely with internal and external design, business, and technical counterparts.
- Strong grasp of data visualization design concepts centered on the business user, and a knack for communicating insights visually.
- Ability to produce any of the available charting methods with drill-down options and action-based reporting. This includes using the right graphs for the underlying data, with company themes and objects.
- Publishing reports & dashboards on a reporting server and providing role-based access to users.
- Ability to create wireframes on any tool for communicating the reporting design.
- Creation of ad-hoc reports & dashboards to visually communicate data hub metrics (metadata information) for top management understanding.
- Should be able to handle huge volumes of data from databases such as SQL Server, Synapse, Delta Lake, or flat files, and create high-performance dashboards.
- Should be proficient in Power BI development
- Expertise in 2 or more BI (Visualization) tools in building reports and dashboards.
- Understanding of Azure components like Azure Data Factory, Data Lake Store, SQL Database, and Azure Databricks
- Strong knowledge of SQL queries
- Must have worked in full life-cycle development from functional design to deployment
- Intermediate ability to format, process, and transform data
- Should have working knowledge of Git, SVN
- Good experience establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP HANA, APIs, various databases, etc.
- Basic understanding of data modelling and ability to combine data from multiple sources to create integrated reports
- Bachelor's degree in Computer Science or Technology
- Proven success in contributing to a team-oriented environment
2-4 years of experience in developing ETL activities for Azure – Big data, relational databases, and data warehouse solutions.
Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Azure Analysis Service, Azure Databricks, Azure Data Catalog, ML Studio, AI/ML, Snowflake, etc.
Well versed in DevOps and CI/CD deployments
Cloud migration methodologies and processes including tools like Azure Data Factory, Data Migration Service, SSIS, etc.
Minimum of 2 years of RDBMS experience
Experience with private and public cloud architectures, pros/cons, and migration considerations.
- DevOps on an Azure platform
- Experience developing and deploying ETL solutions on Azure
- IoT, event-driven, microservices, Containers/Kubernetes in the cloud
- Familiarity with the technology stack available in the industry for metadata management: Data Governance, Data Quality, MDM, Lineage, Data Catalog etc.
- Multi-cloud experience a plus - Azure, AWS, Google
Professional Skill Requirements
Proven ability to build, manage and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Desire to work in an information systems environment
Excellent communication (written and oral) and interpersonal skills
Excellent leadership and management skills
Excellent organizational, multi-tasking, and time-management skills
Monitoring, optimizing, and extending statistical models at scale
Communication with internal stakeholders at all functional levels
Degree in mathematics, machine learning, or other technical/scientific studies
5+ years of experience in data science in collaboration with data engineers
Familiarity with data engineering methodologies
Machine Learning Engineer at Zocket
We are looking for a curious Machine Learning Engineer to join our extremely fast growing Tech Team at Zocket!
Zocket helps businesses create digital ads in less than 30 seconds and grow digitally without any expertise.
Currently there are only two options for an SMB owner, either employ a digital marketing agency or stay away from digital ads. True to the mission, Zocket leverages AI to simplify digital marketing for 300 million+ small businesses around the globe.
You are ideal if you have:
- Interest in working with a fast growing Start Up
- Strong communication and presentation skills
- Ability to meet deadlines
- Critical thinking abilities
- Interest in working in a fast-paced environment
- Desire for lots and lots of learning
- Inclination towards working on diverse projects and to make real contributions to the company
- Bachelor's Degree in Computer Science or any quantitative discipline (Statistics, Mathematics, Economics)
- 3+ Years of relevant experience
- Experience working with languages like Python (mandatory) and R
- Experience working with visualization tools like Tableau and Power BI
- Experience working with frameworks such as OpenCV, PyTorch, TensorFlow
- Prior experience building and deploying ML systems using AWS (EC2, SageMaker)
- Understanding of statistical concepts
- Hands-on Computer Vision experience
- Experience in MySQL is required
- Brownie points if you have expertise in NLP
As a Data Engineer, you are a full-stack data engineer that loves solving business problems.
You work with business leads, analysts and data scientists to understand the business domain
and engage with fellow engineers to build data products that empower better decision making.
You are passionate about the data quality of our business metrics and the flexibility of your
solutions, which scale to respond to broader business questions.
If you love to solve problems using your skills, then come join Team Searce. We have a
casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses
on productivity and creativity, and allows you to be part of a world-class team while still being
yourself.
What You’ll Do
● Understand the business problem and translate it into data services and engineering solutions
● Explore new technologies and learn new techniques to solve business problems
● Think big! Drive the strategy for better data quality for our customers
● Collaborate with many teams - engineering and business, to build better data products
What We’re Looking For
● 1-3 years of experience, with:
○ Hands-on experience of any one programming language (Python, Java, Scala)
○ Understanding of SQL is a must
○ Big data (Hadoop, Hive, Yarn, Sqoop)
○ MPP platforms (Spark, Pig, Presto)
○ Data pipeline & scheduler tools (Oozie, Airflow, NiFi)
○ Streaming engines (Kafka, Storm, Spark Streaming)
○ Any Relational database or DW experience
○ Any ETL tool experience
● Hands-on experience in pipeline design, ETL and application development
This is the first senior person we are bringing for this role. This person will start with the training program but will go on to build a team and eventually also be responsible for the entire training program + Bootcamp.
We are looking for someone fairly senior who has experience in data + tech. At some level, we have all the technical expertise to teach you the data stack as needed, so it's not super important that you know all the tools. However, basic knowledge of the stack is a requirement. The training program covers 2 parts: Technology (our stack) and Process (how we work with clients). Both are super important.
- Full-time flexible working schedule and own end-to-end training
- Self-starter - who can communicate effectively and proactively
- Function effectively with minimal supervision.
- You can train and mentor potential 5x engineers on Data Engineering skillsets
- You can spend time on self-learning and teaching for new technology when needed
- You are an extremely proactive communicator, who understands the challenges of remote/virtual classroom training and the need to over-communicate to offset those challenges.
- Proven experience as a corporate trainer, or a passion for teaching/providing training
- Expertise in the Data Engineering space, with good experience in Data Collection, Data Ingestion, Data Modeling, Data Transformation, and Data Visualization technologies and techniques
- Experience training working professionals on in-demand skills like Snowflake, dbt, Fivetran, Google Data Studio, etc.
- Training/implementation experience using Fivetran, dbt Cloud, Heap, Segment, Airflow, and Snowflake is a big plus
Advanced degree in computer science, math, statistics, or a related discipline (master's degree required)
Extensive data modeling and data architecture skills
Programming experience in Python, R
Background in machine learning frameworks such as TensorFlow or Keras
Knowledge of Hadoop or other distributed computing systems
Experience working in an Agile environment
Advanced math skills (important):
Linear algebra
Differential equations (ODEs and numerical methods)
Theory of statistics 1
Numerical analysis 1 (numerical linear algebra) and 2 (quadrature)
Intermediate analysis (point-set topology)
Strong written and verbal communications
Hands-on experience in NLP and NLG
Experience in advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and experience with their application.
- Use data to develop machine learning models that optimize decision making in Credit Risk, Fraud, Marketing, and Operations
- Implement data pipelines, new features, and algorithms that are critical to our production models
- Create scalable strategies to deploy and execute your models
- Write well designed, testable, efficient code
- Identify valuable data sources and automate collection processes.
- Preprocess structured and unstructured data.
- Analyze large amounts of information to discover trends and patterns.
- 2+ years of experience in applied data science or engineering with a focus on machine learning
- Python expertise with good knowledge of machine learning libraries, tools, techniques, and frameworks (e.g. pandas, sklearn, XGBoost, LightGBM) and models (logistic regression, random forest classifiers, gradient boosting regressors, etc.)
- Strong quantitative and programming skills with a product-driven sensibility
1. Expert in deep learning and machine learning techniques,
2. Extremely good at image/video processing,
3. Good understanding of linear algebra, optimization techniques, statistics, and pattern recognition.
Then you are the right fit for this position.
Company Profile and Job Description
AthenasOwl (AO) is our “AI for Media” solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite meant for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work, and monetizing massive content libraries.
For more details visit: www.athenasowl.tv
Senior Machine Learning Engineer
4-6 years of experience
Mumbai (Malad W)
- Develop cutting edge machine learning solutions at scale to solve computer vision problems in the domain of media, entertainment and sports
- Collaborate with media houses and broadcasters across the globe to solve niche problems in the field of post-production, archiving and viewership
- Manage a team of highly motivated engineers to deliver high-impact solutions quickly and at scale
The ideal candidate should have:
- Strong programming skills in any one or more programming languages like Python and C/C++
- Sound fundamentals of data structures, algorithms and object-oriented programming
- Hands-on experience with any one popular deep learning framework like TensorFlow, PyTorch, etc.
- Experience in implementing Deep Learning Solutions (Computer Vision, NLP etc.)
- Ability to quickly learn and communicate the latest findings in AI research
- Creative thinking for leveraging machine learning to build end-to-end intelligent software systems
- A pleasantly forceful personality and charismatic communication style
- Someone who will raise the average effectiveness of the team and has demonstrated exceptional abilities in some area of their life. In short, we are looking for a “Difference Maker”