Purple Hirez
http://www.purplehirez.com
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale systems that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, and scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have a chance to make a measurable impact as well as take on a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Who you are:
- Bachelor's/Master's/PhD in CS or equivalent industry experience
- Expert-level knowledge of Python
- Expert-level knowledge of at least one web framework such as Django, Flask, or FastAPI (FastAPI preferred)
- Ability to understand and implement microservices, RESTful APIs, and distributed systems using cloud-native principles
- Knowledge of and experience integrating with and contributing to open-source projects and frameworks
- Proven experience building secure, reliable, scalable platforms
- Experience with data control patterns and ORM libraries
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration (e.g., Jenkins)
What you will do:
- Co-lead the design and architecture of the core platform components that power our AI/ML platform
- Integrate data storage solutions, including RDBMSs and key-value stores
- Write reusable, testable, and efficient code
- Design and implement low-latency, high-availability, scalable, and asynchronous applications
- Integrate user-facing elements developed by front-end developers with server-side logic
- Design internal and customer-facing APIs
- Support enterprise customers during implementation and launch of the service
Experience:
- Python: 5 years (Required)
- FastAPI: 2 years (Preferred)
- REST: 5 years (Required)
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Be a part of the growth story of a rapidly growing organization in AI. We are seeking a passionate Machine Learning (ML) Engineer with a strong background in developing and deploying state-of-the-art models on the cloud. You will participate in the complete cycle of building machine learning models, from conceptualization of ideas through data preparation, feature selection, training, evaluation, and productionization.
On a typical day, you might build data pipelines, develop a new machine learning algorithm, train a new model, or deploy the trained model on the cloud. You will have a high degree of autonomy, ownership, and influence over your work, the evolution of the machine learning organization, and the direction of the company.
Required Qualifications
- Bachelor's degree in computer science/electrical engineering or equivalent practical experience
- 7+ years of industry experience in data science and ML/AI projects, including experience productionizing machine learning in an industry setting
- Strong grasp of statistical machine learning, linear algebra, deep learning, and computer vision
- 3+ years of experience with one or more general-purpose programming languages, including but not limited to R and Python
- Experience with PyTorch, TensorFlow, or other ML frameworks
- Experience using cloud services such as AWS, GCP, or Azure, and an understanding of cloud-native application development principles
In this role you will:
- Design and implement ML components, systems and tools to automate and enable our various AI industry solutions
- Apply research methodologies to identify machine learning models that solve a business problem, and deploy those models at scale
- Own the ML pipeline from data collection through prototype development to production
- Develop high-performance, scalable, and maintainable inference services that communicate with the rest of our tech stack
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases such as Druid or equivalents such as Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration (e.g., Jenkins)
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution.
- Mentor team members on best practices
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- Must possess a clear understanding of the fundamentals and concepts of Magento 1/2 and PHP.
- Must have strong experience in Magento Extension development.
- Write well-engineered source code that complies with accepted web standards.
- Strong experience with Magento best practices, including experience developing custom extensions and extending third-party extensions.
- Thorough functional and code level knowledge of all Magento products and all relevant commerce technologies
- Good experience with caching and performance improvement.
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd, and streaming databases such as Druid
- Strong industry expertise with containerization technologies, including Kubernetes and docker-compose
- 2+ years of industry experience in developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- Experience with scripting languages; Python experience highly desirable
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd
- Experience in API development using Swagger
- Strong expertise with containerization technologies, including Kubernetes and docker-compose
- Experience with cloud platform services such as AWS, Azure or GCP.
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration (e.g., Jenkins)
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid
- Assist in DevOps operations
- Develop data ingestion processes and ETLs
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution.
- Mentor team members on best practices
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Job Description
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, and scalable software solutions.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Skills
- Bachelor's/Master's/PhD in CS or equivalent industry experience
- Demonstrated expertise in building and shipping cloud-native applications
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases such as Druid or equivalents such as Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration (e.g., Jenkins)
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution.
- Mentor team members on best practices
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Technical skills:
- Working experience using either of the MVVM or MVP design patterns
- Experience using web services and the SQLite database
- Experience with multithreading concepts (RxJava)
- Experience using version control systems such as SVN, GitHub, Bitbucket, etc.
- Experience using dependency injection principles in Android
- Experience using design patterns
- Experience integrating with third-party APIs such as Facebook and Gmail, and third-party libraries such as Retrofit and Picasso
- Experience with or knowledge of Google Flutter
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- Deep knowledge and working experience in any of the Adobe Marketo, Hybris Marketing, or Emarsys platforms, with end-to-end hands-on implementation and integration experience, including integration tools
- Other platforms of interest include Adobe Campaign, Emarsys, Pardot, Salesforce Marketing Cloud, and Eloqua
- Hybris Marketing or Adobe Marketo Certification strongly preferred
- Experience in evaluating different marketing platforms (e.g. Salesforce SFMC versus Adobe Marketo versus Adobe Marketing)
- Experience in design and setup of Marketing Technology Architecture
- Proficient in scripting and experienced with HTML, XML, CSS, JavaScript, etc.
- Experience with triggering campaigns via API calls
- Responsible for the detailed design of technical solutions, POV on marketing solutions, proof-of-concepts (POCs), prototyping, and documentation of the technical design
- Collaborate with the onshore team in tailoring solutions that meet business needs using an agile/iterative development process
- Perform feasibility analysis for marketing solutions that meet the business goals
- Experience with client discovery workshops and technical solutions presentations
- Excellent communication skills are required, as this is a client business and IT interfacing role
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- Minimum 5 years of hands-on experience with Java Spring Boot technologies
- Experience with monolithic applications
- Experience using Redis and RabbitMQ
- Experience with RDBMSs such as SQL Server and MySQL
- Strong analytical, problem-solving, and data analysis skills
- Excellent communication, presentation, and interpersonal skills are a must
- Experience with microservice frameworks such as Java Spring Boot
- Design and implement automated unit and integration tests
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
As a Senior Technical Lead, you will be a member of a software development team building innovative new features for the application and will lead the team as a Senior Full Stack Developer.
Primary Job Responsibilities:
- Inherit a well-architected, clean, and robust codebase built with .NET Core 5.x, JavaScript, and Angular
- Evaluate and implement new libraries and components
- Ensure best practices are followed in the design and development of the application; contribute to and help manage our open-source libraries
- Strong knowledge of C#, .NET Core, JavaScript, and Angular
- NoSQL databases (MongoDB)
- Real-time web applications (WebSockets / SignalR)
- Containerization technologies (Docker, Kubernetes)
- Swagger / OpenAPI Initiative
- Strong knowledge of design patterns and practices
- Experience in non-Microsoft tech stacks such as Node, Python, Angular, and others is also crucial
- Source control: GitHub
- Unit / performance / load testing
- Experience with Continuous Integration - Jenkins/Travis/Circle/etc.
- Experience working in Agile Environments - JIRA/Slack
- Working knowledge of SaaS architecture and deployments - AWS/Google Cloud/etc.
Similar companies
Truein
About the company
Truein is a B2B SaaS company. We help organizations enable face recognition-based attendance. Truein is designed for the contractual and distributed workforce use case. The solution helps increase workforce productivity and plug attendance leakages.
Jobs
1
Incubyte
About the company
Who we are
We are Software Craftspeople. We are proud of the way we work and the code we write. We embrace and are evangelists of eXtreme Programming practices. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus own quality. And most importantly, we never stop learning!
We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. We work with large established institutions to help them create internal applications to automate manual operations and achieve scale.
We design the software, the team, as well as the organizational strategy required to successfully release robust and scalable products. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments with the aim of bringing a product mindset into services. More on our website: https://www.incubyte.co/
Join our team! We're always looking for like-minded people!
Jobs
14
Sarvaha Systems Private Limited
About the company
Sarvaha's founders have clearly recognized that what mattered to them during their professional journey so far were the people they worked with and the problems they solved, more than the technology or the domain. It's the people and challenges that make for a great work experience. We seek to attract not just good engineers but great individuals. We also realize that building a great team requires a lot of trust, patience, honesty, affection, and serious money.
Sarvaha is a programmer-centric company built by fun-loving programmers. Join us for highly challenging work, a great environment, and very attractive compensation.
Jobs
4
JoVE
About the company
Jobs
3
Universal Transit
About the company
Jobs
1
Nvelop
About the company
Jobs
1
Rohini IT Consulting LLP
About the company
Jobs
13
DataCaliper
About the company
Jobs
0
Unravel Carbon
About the company
Unravel Carbon is the climate platform helping companies with global supply chains make data-driven decisions.
Our platform provides enterprises with reliable insights on their emissions, enabling them to measure, reduce, and report with confidence.
Headquartered in Singapore, with offices in Australia and Vietnam, we serve customers across the globe.
We are trusted by companies across key sectors, including manufacturing, fashion, consumer goods, food and beverage, and financial services.
Our customers include leading enterprises such as Global Fashion Group, Big Dutchman, and AIA.
We are backed by some of the best investors, including Sequoia and Y Combinator.
Our calculation methodology is certified by TÜV Rheinland against the GHG Protocol corporate standard for Scope 1, 2, and 3 emissions, and the ISO 14064 series for GHG inventories.
We ensure data security with our ISO 27001 certification and SOC 2 Type 1 attestation.
At Unravel Carbon, we are driven by a powerful mission: to accelerate the world's progress toward a zero-carbon economy.
Our vision is to integrate climate-driven decisions into global business practices to leave a habitable world for future generations.
Jobs
0
OIP Robotics
About the company
OIP Robotics is an innovative business technology company that was founded on the belief that one company can change the whole face of the insurance business as we know it. OIP Robotics stems from Outsource Insurance Professionals. Our goal is to provide outstanding and innovative tech solutions and services by combining top insurance talent, cultivated within the leading insurance KPO, with top tech talent. We deliver a range of services surrounding full-cycle automation of insurance processes, software development, staff augmentation, and IT support. One of the leading OIPR values is that we look at insurance software with insurance eyes and technology brains. What makes our employees inspired and motivated is working in a safe and agile environment, where everyone's ideas are supported and more than welcome.
Jobs
3