Purple Hirez
http://www.purplehirez.com
Jobs at Purple Hirez
Enterprise AI Solutions
Skills: Angular, React
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale systems that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have a chance to make a measurable impact as well as take on a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Who you are:
- Bachelor's/Master's/PhD in CS or equivalent industry experience
- Expert-level knowledge of Python
- Expert-level knowledge of at least one web framework such as Django, Flask, or FastAPI; FastAPI preferred (a minimal sketch follows this list)
- Understand and implement microservices, RESTful APIs, and distributed systems using cloud-native principles
- Knowledge of and experience integrating with, and contributing to, open-source projects and frameworks
- Repeated experience building secure, reliable, scalable platforms
- Experience with data control patterns and ORM libraries
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Familiarity with continuous integration tools such as Jenkins
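For candidates less familiar with the stack, here is a minimal sketch of the kind of RESTful service this role involves, assuming FastAPI with SQLAlchemy as the ORM; the model, routes, and SQLite database are illustrative placeholders only, not part of the actual platform.

```python
# Minimal sketch (assumed stack: FastAPI + SQLAlchemy; names are illustrative only).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()
engine = create_engine("sqlite:///./demo.db", connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(bind=engine)

class Job(Base):                                        # ORM-mapped table
    __tablename__ = "jobs"
    id = Column(Integer, primary_key=True, index=True)
    title = Column(String, nullable=False)

Base.metadata.create_all(engine)

class JobIn(BaseModel):                                 # request schema
    title: str

app = FastAPI(title="demo-service")

@app.post("/jobs")
def create_job(payload: JobIn):
    with SessionLocal() as db:
        job = Job(title=payload.title)
        db.add(job)
        db.commit()
        db.refresh(job)
        return {"id": job.id, "title": job.title}

@app.get("/jobs/{job_id}")
def read_job(job_id: int):
    with SessionLocal() as db:
        job = db.get(Job, job_id)
        if job is None:
            raise HTTPException(status_code=404, detail="job not found")
        return {"id": job.id, "title": job.title}
```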
What you will do:
- Co-lead the design and architecture of the core platform components that power our AI/ML platform
- Integrate data storage solutions, including RDBMS and key-value stores
- Write reusable, testable, and efficient code
- Design and implement low-latency, high-availability, scalable, and asynchronous applications
- Integrate user-facing elements developed by front-end developers with server-side logic
- Design internal and customer-facing APIs
- Support enterprise customers during implementation and launch of the service
Experience:
- Python: 5 years (Required)
- FastAPI: 2 years (Preferred)
- REST: 5 years (Required)
Skills: Machine Learning, Data Science
Be a part of the growth story of a rapidly growing organization in AI. We are seeking a passionate Machine Learning (ML) Engineer with a strong background in developing and deploying state-of-the-art models on the cloud. You will participate in the complete cycle of building machine learning models, from conceptualization of ideas through data preparation, feature selection, training, evaluation, and productionization.
On a typical day, you might build data pipelines, develop a new machine learning algorithm, train a new model, or deploy the trained model on the cloud. You will have a high degree of autonomy, ownership, and influence over your work, the evolution of the machine learning organization, and the direction of the company.
Required Qualifications
- Bachelor's degree in computer science/electrical engineering or equivalent practical experience
- 7+ years of industry experience in data science and ML/AI projects, including productionizing machine learning in an industry setting
- Strong grasp of statistical machine learning, linear algebra, deep learning, and computer vision
- 3+ years of experience with one or more general-purpose programming languages, including but not limited to R and Python
- Experience with PyTorch, TensorFlow, or other ML frameworks (see the sketch after this list)
- Experience using cloud services such as AWS, GCP, or Azure, and an understanding of the principles of cloud-native application development
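As an illustration of the training and evaluation portion of that cycle, here is a minimal sketch assuming PyTorch; the dataset, model, and artifact name are toy placeholders, not the company's actual models.

```python
# Minimal train/evaluate sketch (PyTorch assumed; data and model are toy placeholders).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 2 features -> binary label (stand-in for real feature-engineered data)
X = torch.randn(512, 2)
y = (X.sum(dim=1) > 0).long()
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                      # training
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()

with torch.no_grad():                       # evaluation
    acc = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"train accuracy: {acc:.3f}")

torch.save(model.state_dict(), "model.pt")  # artifact handed to the serving layer
```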
In this role you will:
- Design and implement ML components, systems and tools to automate and enable our various AI industry solutions
- Apply research methodologies to identify machine learning models that solve a business problem, and deploy the models at scale.
- Own the ML pipeline from data collection through prototype development to production.
- Develop high-performance, scalable, and maintainable inference services that communicate with the rest of our tech stack
Skills: Python
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents like Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Familiarity with continuous integration tools such as Jenkins
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid (a streaming sketch follows this list)
- Create custom operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
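The following is a minimal sketch of such a streaming pipeline, assuming PySpark Structured Streaming reading JSON events from Kafka; the broker address, topic, schema, and sink paths are illustrative placeholders, and the job requires the spark-sql-kafka connector on the classpath.

```python
# Minimal sketch: PySpark Structured Streaming job reading JSON events from Kafka.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-pipeline").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
    StructField("ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")                       # needs spark-sql-kafka package
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")                   # or a Druid/Hive-compatible sink
    .option("path", "/data/events")
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```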
- Must possess a fair, clear understanding of the fundamentals and concepts of Magento 1/2 and PHP.
- Must have strong experience in Magento extension development.
- Write well-engineered source code that complies with accepted web standards.
- Strong experience with Magento best practices, including experience developing custom extensions and extending third-party extensions.
- Thorough functional and code-level knowledge of all Magento products and all relevant commerce technologies.
- Good experience with caching and performance improvement.
Enterprise Artificial Intelligence
Skills: Data Analytics, Python
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd, and streaming databases like Druid
- Strong industry expertise with containerization technologies, including Kubernetes and Docker Compose
- 2+ years of industry experience in developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- Experience with scripting languages; Python experience highly desirable
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in API development using Swagger
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Familiarity with continuous integration tools such as Jenkins
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid (see the ingestion sketch after this list)
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
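A minimal ingestion/ETL sketch is shown below, assuming the kafka-python client; the broker address, topic names, and the enrichment step are placeholders, and a production pipeline would add batching, error handling, and a Druid-compatible sink.

```python
# Minimal ingestion/ETL sketch using kafka-python (broker, topics, and enrichment are placeholders).
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="etl-worker",
    auto_offset_reset="earliest",
)
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    record = message.value
    record["ingested"] = True               # stand-in for real cleansing/enrichment
    producer.send("clean-events", record)   # downstream sink could feed Druid ingestion
```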
Skills: Data Analytics, Python
Job Description
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Skills
- Bachelor's/Master's/PhD in CS or equivalent industry experience
- Demonstrated expertise in building and shipping cloud-native applications
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents like Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Familiarity with continuous integration tools such as Jenkins
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom operators for Kubernetes and Kubeflow (a minimal operator sketch follows this list)
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
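As one illustration of the custom-operator work, here is a minimal sketch assuming the kopf framework and the official Kubernetes Python client; the `pipelines.example.com` custom resource and the ConfigMap it provisions are hypothetical examples, not an actual CRD used by the platform. It would be run with `kopf run operator.py` against a cluster.

```python
# Minimal custom-operator sketch using kopf (CRD group/plural is hypothetical).
import kopf
import kubernetes

kubernetes.config.load_kube_config()   # or load_incluster_config() inside a cluster

@kopf.on.create("example.com", "v1", "pipelines")
def create_pipeline(spec, name, namespace, logger, **kwargs):
    # React to a new Pipeline custom resource, e.g. by creating a ConfigMap for it.
    body = kubernetes.client.V1ConfigMap(
        metadata=kubernetes.client.V1ObjectMeta(name=f"{name}-config"),
        data={"topic": spec.get("topic", "events")},
    )
    api = kubernetes.client.CoreV1Api()
    api.create_namespaced_config_map(namespace=namespace, body=body)
    logger.info(f"Provisioned config for pipeline {name}")
    return {"configmap": f"{name}-config"}
```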
Skills: Android
Technical skills:
- Working experience using design principles such as MVVM or MVP
- Experience using web services and the SQLite database
- Experience with multithreading concepts, e.g. RxJava
- Experience using version control systems such as SVN, GitHub, Bitbucket, etc.
- Experience using dependency injection principles in Android
- Experience using design patterns
- Experience integrating third-party APIs such as Facebook and Gmail, and third-party libraries such as Retrofit and Picasso
- Experience with or knowledge of Google Flutter
- Deep knowledge of and working experience in any of the Adobe Marketo, Hybris Marketing, or Emarsys platforms, with end-to-end hands-on implementation and integration experience, including integration tools
- Other platforms of interest include Adobe Campaign, Emarsys, Pardot, Salesforce Marketing Cloud, and Eloqua
- Hybris Marketing or Adobe Marketo certification strongly preferred
- Experience in evaluating different marketing platforms (e.g. Salesforce SFMC versus Adobe Marketo versus Adobe Marketing)
- Experience in the design and setup of marketing technology architecture
- Proficient in scripting and experienced with HTML, XML, CSS, JavaScript, etc.
- Experience with triggering campaigns via API calls (a hedged example follows this list)
- Responsible for the detailed design of technical solutions, POVs on marketing solutions, proofs of concept (POCs), prototyping, and documentation of the technical design
- Collaborate with the onshore team in tailoring solutions that meet business needs, using an agile/iterative development process
- Perform feasibility analysis for marketing solutions that meet the business goals
- Experience with client discovery workshops and technical solution presentations
- Excellent communication skills are required, as this is a client business and IT interfacing role
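As a rough illustration of triggering a campaign over an API, here is a hedged sketch against Marketo's REST interface; the instance URL, campaign and lead IDs, and credentials are placeholders, and the endpoint paths should be verified against the instance's own API documentation before use.

```python
# Hedged sketch of triggering a Marketo smart campaign over REST
# (instance URL, IDs, and credentials are placeholders; verify endpoints before use).
import requests

BASE = "https://123-ABC-456.mktorest.com"   # Marketo instance endpoint (placeholder)

# 1. Exchange client credentials for an access token.
token = requests.get(
    f"{BASE}/identity/oauth/token",
    params={
        "grant_type": "client_credentials",
        "client_id": "<client-id>",
        "client_secret": "<client-secret>",
    },
    timeout=30,
).json()["access_token"]

# 2. Request (trigger) a smart campaign for a specific lead.
resp = requests.post(
    f"{BASE}/rest/v1/campaigns/1029/trigger.json",
    params={"access_token": token},
    json={"input": {"leads": [{"id": 123}]}},
    timeout=30,
)
print(resp.json())
```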
AI Services & Digital Transformation
- Minimum 5 years of hands-on experience with Java Spring Boot technologies
- Experience with monolithic applications
- Experience using Redis and RabbitMQ
- Experience with RDBMS such as SQL Server and MySQL
- Strong analytical, problem-solving, and data analysis skills
- Excellent communication, presentation, and interpersonal skills are a must
- Experience with microservice frameworks such as Java Spring Boot
- Design and implement automated unit and integration tests
Skills: .NET
As a Senior Technical Lead, you will be a member of a software development team building innovative new features for the application, and you will lead the team as a Senior Full Stack Developer.
Primary Job Responsibilities:
- Inherit a well-architected, clean, and robust codebase built with .NET Core 5.x, JavaScript, and Angular
- Evaluate and implement new libraries and components
- Ensure best practices are followed in the design and development of the application
- Contribute to and help manage our open-source libraries
- Strong knowledge of C#, .NET Core, JavaScript, and Angular
- NoSQL databases (MongoDB)
- Real-time web applications (WebSockets / SignalR)
- Containerization technologies (Docker, Kubernetes)
- Swagger / OpenAPI Initiative
- Strong knowledge of design patterns and practices
- Experience in non-Microsoft tech stacks such as Node, Python, Angular, and others
- Source control - GitHub
- Unit / performance / load testing
- Experience with continuous integration - Jenkins/Travis/Circle/etc.
- Experience working in Agile environments - JIRA/Slack
- Working knowledge of SaaS architecture and deployments - AWS/Google Cloud/etc.
Similar companies
Appknox, a leading mobile app security solution HQ in Singapore & Bangalore was founded by Harshit Agarwal and Subho Halder.
Since its inception, Appknox has become one of the go-to security solutions with the most powerful plug-and-play security platform, enabling security researchers, developers, and enterprises to build safe and secure mobile ecosystems using a system-plus human approach.
Appknox offers VA+PT solutions (Vulnerability Assessment + Penetration Testing) that provide end-to-end mobile application security and testing strategies to Fortune 500, SMB, and large enterprises globally, helping businesses and mobile developers make their mobile apps more secure, thus enhancing protection not only for their customers but also for their own brand.
Over the course of 9 years, Appknox has scaled up to work with some major brands in India, South-East Asia, the Middle East, Japan, and the US, and has also successfully enabled some of the top government agencies with its on-premise deployments and compliance testing. Appknox helps 500+ enterprises, including 20+ Fortune 1000 companies and ministries/regulators across 10+ countries, and some of the top banks across 20+ countries.
A champion of Value SaaS, with its customer- and security-first approach, Appknox has won many awards and recognitions from G2 and Gartner, and is one of the top mobile app security vendors in Gartner's 2021 Application Security Hype Cycle report.
Our forward-leaning, pioneering spirit is backed by SeedPlus, JFDI Asia, Microsoft Ventures, and Cisco Launchpad, and a legacy of expertise that began at the dawn of 2014.
One of India's best online learning platforms for Math. Our mission statement: "Math bole toh MathonGo"
Unravel Carbon is the climate platform helping companies with global supply chains make data-driven decisions.
Our platform provides enterprises with reliable insights on their emissions, enabling them to measure, reduce, and report with confidence.
Headquartered in Singapore, with offices in Australia and Vietnam, we serve customers across the globe.
We are trusted by companies across key sectors, including manufacturing, fashion, consumer goods, food and beverage, and financial services.
Our customers include leading enterprises such as Global Fashion Group, Big Dutchman, and AIA.
We are backed by some of the best investors, including Sequoia and Y Combinator.
Our calculation methodology is certified by TÜV Rheinland against the GHG Protocol corporate standard for Scope 1, 2, and 3 emissions, and the ISO 14064 series for GHG inventories.
We ensure data security with our ISO 27001 certification and SOC 2 Type 1 attestation.
At Unravel Carbon, we are driven by a powerful mission: to accelerate the world's progress toward a zero-carbon economy.
Our vision is to integrate climate-driven decisions into global business practices to leave a habitable world for future generations.
KrispCall is experiencing rapid growth, and we're looking for talented individuals to join us on this exciting journey. We offer a dynamic and rewarding work environment where you can learn, grow, and contribute to a product that's transforming the way businesses communicate. We value our employees and provide opportunities for professional development and advancement. Explore our open positions and discover your potential at KrispCall!