

About OpexAI
We at opexAI strive to shape business strategies and provide effective solutions to your complex business problems using AI, machine learning, and cognitive computing approaches. With 40+ years of experience in analytics and a workforce of highly skilled and certified consultants, we provide a realistic approach to help you chart an optimal path to success.


• Should know OOP concepts, Core Java, and basic Android development.
• Able to design, develop, test, and implement an Android application.
• Candidates with basic knowledge of JavaScript and jQuery will get a chance to work on React Native.
• Understanding of Linux/Ubuntu, web servers, and cross-browser compatibility.
• Strong knowledge of UI development.
• Knowledge of third-party API implementation in iOS and Android app development is a plus.


Our client is a leading Indian consumer electronics start-up and one of the fastest-growing start-ups in the country; it has also emerged as the fifth-largest wearable brand globally. The team is passionate about the impact it is making in people's lives and is looking for a fellow self-starter to join its ambitious bunch.
Responsibilities -
• Support the analytics roadmap and strategy by ensuring that the landscape is mature, resilient, and flexible enough to meet business requirements without deviating from the overall IT vision
• Collaborate with business teams to identify opportunities and pain areas, incubate analytics product ideas, and deliver analytics applications in a digital-native manner
• Hold the analytics program together
• Manage the analytics vertical end to end: start by setting up the platform groundwork (a data lake equivalent) and delivering business value through analytics, then gradually bring intelligence into everything we do
• Play a key role in transforming the company into an intelligent enterprise
• Deliver analytics use cases that directly contribute to our top-line growth
• Be responsible for overall data hygiene, governance, and monitoring of data pipelines, and continuously deliver business value through data
Requirements -
• 5-7 years of experience in a similar role
• Bachelor's degree in computer science or analytics
• Master's degree (business administration or equivalent) with a focus on analytics or data science
• Good verbal and written communication skills
• Good networking skills
• Thrives in a dynamic, unstructured environment and transcends job boundaries and descriptions.
Sr. DevOps Engineer (5 to 8 yrs. exp.)
Location: Ahmedabad
- Strong experience in cloud infrastructure provisioning using Terraform and AWS CloudFormation templates.
- Strong experience in serverless and containerization technologies such as Kubernetes and Docker.
- Strong experience in Jenkins and AWS-native CI/CD implementation as code.
- Strong experience in cloud operational automation using Python, shell scripts, the AWS CLI, AWS Systems Manager, AWS Lambda, etc. (see the Python sketch below).
- Day-to-day AWS cloud administration tasks.
- Strong experience in configuration management using Ansible and PowerShell.
- Strong experience in Linux and at least one scripting language is required.
- Knowledge of monitoring tools will be an added advantage.
- Understanding of DevOps practices covering continuous integration, delivery, and deployment.
- Hands-on experience with the application deployment process.
Key Skills: AWS, Terraform, Serverless, Jenkins, DevOps, CI/CD, Python, CLI, Linux, Git, Kubernetes
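As a rough illustration of the Python-based cloud operational automation the bullets above ask for (a minimal sketch, not part of the posting; it assumes boto3 credentials are configured and uses a placeholder region), the following lists running EC2 instances:

```python
import boto3  # AWS SDK for Python

# Day-to-day AWS administration sketch: report every running EC2 instance.
# The region is a placeholder assumption, not taken from the job posting.
ec2 = boto3.client("ec2", region_name="ap-south-1")

def running_instances():
    """Yield (instance_id, name_tag) for each running EC2 instance."""
    paginator = ec2.get_paginator("describe_instances")
    filters = [{"Name": "instance-state-name", "Values": ["running"]}]
    for page in paginator.paginate(Filters=filters):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                yield instance["InstanceId"], tags.get("Name", "<unnamed>")

if __name__ == "__main__":
    for instance_id, name in running_instances():
        print(f"{instance_id}\t{name}")
```

The same pattern (a boto3 client plus a paginator) extends to most routine administration scripts, such as snapshotting volumes or cleaning up unattached resources.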
Role: Software Developer
Industry Type: IT-Software, Software Services
Functional Area: IT Software - Application Programming, Maintenance
Employment Type: Full Time, Permanent
Education: Any computer graduate.
Salary: Best in industry.

Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.
Job Summary:
As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies
2. Minimum 2.5 years of experience in big data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and the other components required to build end-to-end data pipelines; a minimal ingestion sketch follows this list
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security
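Purely as an illustration of the kind of end-to-end pipeline component item 3 refers to (a minimal sketch, not part of the job description; broker addresses, topic name, and HDFS paths are placeholder assumptions), here is a PySpark Structured Streaming job that ingests a Kafka topic and lands it as Parquet:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Minimal ingestion sketch: Kafka topic -> Spark Structured Streaming -> Parquet on HDFS.
spark = SparkSession.builder.appName("kafka-to-hdfs-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")  # placeholder brokers
    .option("subscribe", "events")                                   # placeholder topic
    .load()
    # Kafka delivers key/value as binary; cast them to strings for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/raw/events")           # placeholder output path
    .option("checkpointLocation", "hdfs:///chk/events")  # required by the file sink
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

In a real pipeline the same skeleton would add schema parsing of the value column, partitioning of the output, and orchestration through a scheduler such as Airflow or Oozie.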
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks such as ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality
6. Cloud data specialty and other related big data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes


- Minimum of 6 years of experience in software development.
- Proficiency in developing declarative and component-based SPAs using Vue.js.
- Experience with front-end technologies such as HTML, CSS, JavaScript, and TypeScript.
- Expertise in Vue.js, including Vuex for state management and Vue Router for navigation.
- Strong understanding of Java and experience with the Spring Boot framework.
- Knowledge of RESTful APIs and microservices architecture.
- Knowledge of the Spring Security framework for implementing authentication and authorization in Spring-based applications.
- Experience with Keycloak and access management solutions for securing web applications.
- Understanding of the OAuth and OpenID Connect protocols for single sign-on (SSO) authentication.
- Ability to integrate and configure Keycloak with Spring Boot applications to manage user authentication and authorization.
- Experience with JSON Web Tokens (JWT) for secure transmission of authentication data between client and server.
- Bachelor's degree in Computer Science, Engineering, or a related field.
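The listing itself is Java/Spring focused; purely as a language-neutral illustration of the JWT verification step it describes (a minimal sketch using Python's PyJWT, where the Keycloak realm URL and audience are placeholder assumptions), token validation against Keycloak's published signing keys looks roughly like this:

```python
import jwt  # PyJWT >= 2.x

# Placeholder Keycloak realm JWKS endpoint and client audience (assumptions).
JWKS_URL = "https://keycloak.example.com/realms/myrealm/protocol/openid-connect/certs"
AUDIENCE = "my-spa-client"

def verify_access_token(token: str) -> dict:
    """Verify an RS256-signed Keycloak access token and return its claims."""
    # Fetch the signing key that matches the token's key ID from the realm's JWKS.
    signing_key = jwt.PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    # Checks signature, expiry, and audience; raises jwt.InvalidTokenError on failure.
    return jwt.decode(token, signing_key.key, algorithms=["RS256"], audience=AUDIENCE)
```

In the stack described above, Spring Security's resource-server support performs the equivalent check on the Spring Boot side, while the Vue.js SPA only attaches the token to its requests.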
Greetings from Uplogic Technologies Pvt Ltd!
We are looking for a dynamic and organized professional to join our team at the Madurai branch: someone who excels in everything they do, is interested in learning and growing along with the Uplogic team, and wants to contribute to the development of the organization. If you are interested, share your updated profile.
Experience: 1 to 5 years
Designation: SEO Analyst
Job Location: Madurai
Job Description
- Must have sound knowledge of on-page and off-page optimization
- Should have hands-on experience with link-building strategies
- Must be strong in organic SEO
- Exposure to content writing and blog writing is an added value
- Knowledge of overall digital marketing is good to have
- Should have strong interpersonal and communication skills
It would also be highly appreciated if you could refer any friends whose profiles are suitable for the same.


The candidate should take full ownership of the project. This is a huge learning opportunity.
Profile: DevOps Engineer
Experience: 5-8 Yrs
Notice Period: Immediate to 30 Days
Job Description:
Technical Experience (Must Have):
Cloud: Azure
DevOps Tool: Terraform, Ansible, Github, CI-CD pipeline, Docker, Kubernetes
Network: Cloud Networking
Scripting Language: Any/All - Shell Script, PowerShell, Python
OS: Linux (Ubuntu, RHEL etc)
Database: MongoDB
Professional Attributes: Excellent communication, written, presentation, and problem-solving skills.
Experience: Minimum of 5-8 years of experience in Cloud Automation and Application
Additional Information (Good to have):
Microsoft Azure Fundamentals AZ-900
Terraform Associate
Docker
Certified Kubernetes Administrator
Role:
- Building and maintaining tools to automate application and infrastructure deployment, and to monitor operations.
- Designing and implementing cloud solutions that are secure, scalable, resilient, monitored, auditable, and cost-optimized.
- Implementing the transformation from the as-is state to the future state.
- Coordinating with other members of the DevOps team, Development, Test, and other teams to enhance and optimize existing processes.
- Providing systems support and implementing monitoring, logging, and alerting solutions that enable production systems to be monitored.
- Writing Infrastructure as Code (IaC) using industry-standard tools and services.
- Writing application deployment automation using industry-standard deployment and configuration tools.
- Designing and implementing continuous delivery pipelines that provision and operate client test as well as production environments.
- Implementing and staying abreast of cloud and DevOps industry best practices and tooling.
**Sr. Backend Developer Responsibilities:**
* Commitment towards delivering features within the estimated time
* Follow standard coding guidelines when writing code and when reviewing fellow team members' code
* Strong understanding of REST Framework
* Designing and developing REST APIs
* Ensuring scalability of code written
**Sr. Backend Developer Requirements:**
* Strong understanding of Node or any other equivalent language
* Strong understanding of database technology such as MySQL and MongoDB
* Good understanding of AWS, Redis, Elasticsearch, New Relic, Sentry, etc.
* Have experience in monitoring and managing production level systems
* Degree in Computer Science
* Excellent verbal communication skills
* Good problem-solving skills
* Attention to detail
**About Easy Eat**
Easy Eat is reimagining the experience of dining in at a restaurant. We're starting with Malaysia and South East Asia, but we believe that, 5 years from now, you'll experience it the same way across the world.
We've achieved Product Market Fit and are growing rapidly across geographies. We crossed the $10M annual run rate in just 12 months and we're really excited for the journey ahead.
**About the team**
We're a group of experienced founders who have *been there, done that* in the past. We have raised funds from global VCs in the industry, have operated at scale, and know how to leverage data and grow culture and teams.

- In-depth knowledge of Core Python and Django for end-to-end application development (a minimal Django sketch follows this list).
- Experience in web technologies: HTML, CSS, JavaScript.
- Database: SQL Server, Postgres, or a NoSQL database.
- Good understanding of algorithms and data structures.
- Knowledge of ORM (Object Relational Mapper) libraries.
- Experience in integrating multiple data sources and databases into one system.
- Knowledge of REST / SOAP APIs.
- Knowledge of version control tools such as Git.
- Experience with various cloud technologies.
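As a minimal sketch of the Core Python / Django / ORM / REST items above (not part of the listing; it assumes the code lives inside a Django app, and the model and field names are illustrative), a model plus a small JSON endpoint built on the Django ORM might look like this:

```python
from django.db import models
from django.http import JsonResponse

class Order(models.Model):
    """Illustrative model; the model and field names are assumptions."""
    customer = models.CharField(max_length=100)
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    created_at = models.DateTimeField(auto_now_add=True)

def recent_orders(request):
    """Return the ten most recent orders as JSON via the Django ORM."""
    orders = Order.objects.order_by("-created_at")[:10]
    data = [{"customer": o.customer, "amount": str(o.amount)} for o in orders]
    return JsonResponse({"orders": data})
```

Wired to a URL pattern, this is the shape of a simple REST-style read endpoint; a production version would typically use Django REST Framework serializers and viewsets instead.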

