Core Java - Multithreading, Exception handling, Garbage collection.
at OpexAI

About OpexAI
We at OpexAI strive to shape business strategies and provide effective solutions to your complex business problems using AI, machine learning, and cognitive computing approaches. With 40+ years of experience in analytics and a workforce of highly skilled and certified consultants, we provide a realistic approach to help you chart an optimal path to success.
Roles and Responsibilities:
• Own development, design, scaling, and maintenance of application and messaging engines that power the central platform of Capillary's Cloud CRM product.
• Work on the development of AI and data science products for various use cases. Implement PoCs in Python and Spark/Scala, and productize the implementations.
• Contribute to overall design and roadmap.
• Mentor Junior team members.
Required Skills:
• Innovative and self-motivated with a passion to develop complex and scalable applications.
• 3+ years of experience in software development with a strong focus on algorithms and data structures.
• Strong coding and design skills with prior experience in developing scalable, high-availability applications. Expertise in using Core Java/J2EE or Node.js (see the sketch after this list).
• Work experience with relational and non-relational databases is required (primarily MySQL, MongoDB, and Redis).
• Familiarity with big data platforms (like Spark/Scala) is an added plus.
• Strong analytical and problem-solving skills.
• BTech from IIT or BE in computer science from a top REC/NIT.
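As a rough illustration of the Core Java topics named in the title (multithreading and exception handling, with garbage collection left to the JVM), here is a minimal, self-contained sketch; the task and class names are invented for illustration only.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class WorkerDemo {
        public static void main(String[] args) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(2);

            // Submit a task that may fail; the exception is handled inside the task
            // so the pool thread survives and can run further work.
            pool.submit(() -> {
                try {
                    processRecord("not-a-number");
                } catch (NumberFormatException e) {
                    System.err.println("Bad record skipped: " + e.getMessage());
                }
            });
            pool.submit(() -> System.out.println("Processed: " + processRecord("42")));

            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.SECONDS);
            // Objects created by finished tasks become unreachable here and are
            // reclaimed automatically by the garbage collector; no manual freeing.
        }

        private static int processRecord(String raw) {
            return Integer.parseInt(raw.trim());   // throws NumberFormatException on bad input
        }
    }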
Job Perks
• Competitive Salary as per market standards
• Flexible working hours
• Chance to work with a world-class engineering team.
Why Join Us:
Be part of a fast-moving tech team building impactful, user-friendly apps with modern development practices and a collaborative work culture.
Capillary is an Equal Opportunity Employer and will not discriminate against any applicant for employment on the basis of race, age, religion, sex, veteran status, disability, sexual orientation, or gender identity.
Disclaimer:
It has been brought to our attention that there have recently been instances of fraudulent job offers purporting to be from Capillary Technologies. The individuals or organizations sending these false employment offers may pose as a Capillary Technologies recruiter or representative and request personal information, the purchase of equipment, or funds to further the recruitment process, or offer paid training. Be advised that Capillary Technologies does not extend unsolicited employment offers. Furthermore, Capillary Technologies does not charge prospective employees fees or request funding as part of the recruitment process.
We commit to an inclusive recruitment process and equality of opportunity for all our job applicants.
Job Description:
Candidates who are immediate joiners and have career stability are preferred.
Skills: Data Structures and Algorithms, OOP concepts
Job Brief:
· Understand product requirements and come up with solution approaches
· Build and enhance large-scale, domain-centric applications
· Deploy high-quality deliverables into production, adhering to security, compliance, and SDLC guidelines
· You have a good understanding of the fundamentals of data science/algorithms or software engineering
· Preferably, you have done a project or internship related to the field
· Knowledge of SQL is a plus
· A deep desire to learn new things and be part of a vibrant start-up. You will have a lot of free hand, and this comes with immense responsibility, so you are expected to be willing to master new things that come along!
What you will get to do:
● Build cloud-based services and/or user interfaces
● Participate in all aspects of software development activities, including design, coding, code review, unit testing, bug fixing, and code/API documentation
● Be among the first few members of a growing technology team
At Egnyte we develop content governance and collaboration products that are deployed across several large companies such as Yamaha and Red Bull. The Egnyte platform supports daily, business-critical operations for a million-plus user base interacting with a multi-petabyte content set.
We store, analyze, organize, and secure billions of files and petabytes of data with millions of users. We observe more than 1M API requests per minute on average. To make that possible and to provide the best possible experience, we rely on great engineers. For us, people who own their work from start to finish are integral. Our Engineers are part of the process from design to code, to test, to deployment, and back again for further iterations.
We're looking for Senior Software Engineers who can take a complex problem and work with product managers, DevOps, and other team members to execute it end to end.
- Design and develop scalable cloud components that seamlessly integrate with on-premises systems.
- Challenge and redefine existing architecture or make 10x improvements in performance and scalability.
- Ability to foresee post-deployment design challenges, performance and scale bottlenecks.
- Hire and mentor junior engineers
- Do code reviews and unit and performance testing of the code.
- Monitor and manage 3000+ nodes using modern DevOps tools and APM solutions.
- Demonstrated success designing and developing complex cloud based solutions
- Solid CS fundamentals with one or more areas of deep knowledge
- Experience with the following technologies: Java, SQL, Linux, Python, Nginx, HAProxy, BigQuery, HBase, New Relic, memcache, Elasticsearch, Docker.
- Data-driven decision making
- Reliance on automated testing instead of manual QA
- Experience working with Google Cloud, AWS, or Azure is preferred
We would prefer the candidate to work from our Mumbai office for at least the first 6 months.
Software Developer
Roles and Responsibilities
- Apply your skill set to fetch data from multiple online sources, cleanse it, and build APIs on top of it
- Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store
- We're looking for people who will naturally take ownership of data products and who can bring a project all the way from a fast prototype to production.
Desired Candidate Profile
- At least 1-2 years of experience
- Strong coding experience in Python (knowledge of JavaScript is a plus)
- Strong knowledge of scraping frameworks in Python (Requests, Beautiful Soup)
- Experience with SQL and NoSQL databases
- Knowledge of version control tools like Git.
- Good understanding of and hands-on experience with scheduling and managing tasks with cron.
Nice to have:
- Experience of having worked with Elasticsearch
- Experience with multi-processing, multi-threading, and AWS/Azure is a plus
- Experience with web crawling is a plus
- Deploy server/related components to staging and live environments.
- Experience with cloud environments like AWS, etc., as well as cloud solutions like Docker, Lambda, etc.
- Experience in DevOps and related practices to improve the development lifecycle and achieve continuous delivery with high quality is an advantage.
- Compile and analyze data, processes, and codes to troubleshoot problems and identify areas for improvement
Profile: Java developer (2-4 years) - Noida
MUST have: Core Java, data structures
Notice period: Immediate to 30 days
Salary: Up to 9 LPA
MUST haves: Java 1.8, Core Java, data structures
Requirements:
- Java 1.8 with working knowledge of Collections and multithreading (mandatory; a brief sketch follows this list)
- Data Structures, BPM (Alfresco), and Groovy
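A minimal sketch of the kind of Java 1.8 Collections and multithreading usage the mandatory requirement above refers to, using streams over a collection and a small thread pool; the data and class name are made up for illustration.

    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class OrderStats {
        public static void main(String[] args) throws Exception {
            List<Integer> amounts = Arrays.asList(120, 45, 300, 75);   // illustrative data

            // Java 8 Collections + streams: filter and aggregate with lambdas
            int largeOrderTotal = amounts.stream()
                    .filter(a -> a > 50)
                    .mapToInt(Integer::intValue)
                    .sum();

            // Multithreading: run independent computations on a fixed thread pool
            ExecutorService pool = Executors.newFixedThreadPool(2);
            Future<Integer> max = pool.submit(() -> amounts.stream().max(Integer::compare).orElse(0));
            Future<Double> avg = pool.submit(() -> amounts.stream().mapToInt(Integer::intValue).average().orElse(0.0));

            System.out.printf("total=%d max=%d avg=%.1f%n", largeOrderTotal, max.get(), avg.get());
            pool.shutdown();
        }
    }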
Responsibilities:
- Design and develop scalable, high-performance, and reliable API-driven services/applications in Java that operate around the clock.
- Produce high quality software that is unit tested, code reviewed, and checked in regularly for continuous integration.
- Interact with business, financial research, and technical stakeholders to deliver high-quality products and services that meet or exceed business, customer, and technical requirements.
- Own products and code from cradle to grave including production quality/performance monitoring.
Required Skills:
- Experience building and running high-performance enterprise applications developed in Java.
- Hands-on experience developing Java web applications built with modern, standards-based APIs, including Java-based REST APIs and implementations (see the sketch after this list).
- Experience in Java language APIs, Spring technologies, Hibernate, JDBC, RDBMS and NoSQL based data persistence.
- Experience developing solutions on AWS Cloud leveraging AWS Lambda, S3, DynamoDB, CloudFormation, and other related AWS technologies.
- Solid understanding of Domain Modeling with Relational and Non-Relational Databases.
- Excellent debugging and problem-solving skills, including ability to diagnose and fix performance and scalability issues.
- Experience working in an agile methodology-based development environment.
- Excellent written and verbal communication skills.
- BS/B.Tech/M.Tech in Computer Science or a related field.
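For the Java-based REST API point above, here is a minimal Spring Boot sketch of a standards-based endpoint; it assumes spring-boot-starter-web is on the classpath, and the resource names are purely illustrative.

    import java.util.Arrays;
    import java.util.List;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @RestController
    @RequestMapping("/api/accounts")
    public class AccountApi {

        // GET /api/accounts -> a hard-coded list stands in for a real data store
        @GetMapping
        public List<String> list() {
            return Arrays.asList("acct-1001", "acct-1002");
        }

        // GET /api/accounts/{id} -> return a single resource by id
        @GetMapping("/{id}")
        public String get(@PathVariable long id) {
            return "acct-" + id;
        }

        public static void main(String[] args) {
            SpringApplication.run(AccountApi.class, args);
        }
    }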
Share your resumes if this opportunity suits you.

Preferred Skills:
- Spring Boot is strongly emphasized (2+ years, or 1+ if the candidate is particularly strong)
- Using Redis as a caching technology with Spring Boot would be a strong plus (a minimal caching sketch follows the Nice to Have list below)
- Using Redisson (a Java client library that can be easily configured with Spring Boot) would be a strong plus
- Knowledge of event-based messaging systems (Amazon SNS, Amazon MQ, or Kafka in AWS)
- Data cleaning tools and techniques for CSV and Excel
- Strong knowledge of Spring Boot dependency injection and configuration
- Experience with APIs for popular e-commerce platforms (Magento, Shopify, BigCommerce, etc.)
- SDLC (Software Development Lifecycle) tools in the context of AWS (tools classified under DevOps)
- Experience managing AWS EC2 VM instances and using AWS managed services (such as S3, MySQL, VPC/networking, Lambda, etc.)
- Performance analysis tools (code profiling) on the Java VM, particularly with Spring Boot
- Experience in the development of workflow or business process applications
Nice to Have:
- Experience with Cassandra or MongoDB with Spring Boot
- Horizontal scaling with Spring Boot (considerations for running multiple Spring Boot instances)
- Experience placing Spring Boot applications in Docker/Kubernetes container ecosystems (especially in AWS)
- Search technologies such as Lucene/Solr
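As a rough sketch of the Redis-as-cache and dependency-injection items in the Preferred Skills list above: the example assumes spring-boot-starter-cache, spring-boot-starter-data-redis, and spring-boot-starter-web are on the classpath with Redis connection settings in application.properties; CatalogService and CatalogController are made-up names.

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cache.annotation.Cacheable;
    import org.springframework.cache.annotation.EnableCaching;
    import org.springframework.stereotype.Service;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @EnableCaching   // turns on Spring's cache abstraction, backed here by Redis
    public class CacheDemoApplication {
        public static void main(String[] args) {
            SpringApplication.run(CacheDemoApplication.class, args);
        }
    }

    @Service
    class CatalogService {
        // The first call for a given id runs the method; later calls for the same id
        // are served from the "products" cache stored in Redis.
        @Cacheable("products")
        public String findProduct(long id) {
            return "product-" + id;   // stand-in for a slow database or remote lookup
        }
    }

    @RestController
    class CatalogController {
        private final CatalogService service;

        // Constructor injection: Spring supplies the CatalogService bean
        CatalogController(CatalogService service) {
            this.service = service;
        }

        @GetMapping("/products/{id}")
        public String product(@PathVariable long id) {
            return service.findProduct(id);
        }
    }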
- Strong experience in Core Java technologies
- Good experience in RESTful web services
- Good experience in database concepts
- Good communication skills and client/customer interaction experience
- Demonstrable ability to optimize code; strong analytical skills for effective problem solving
- Experience with Java Spring, Hibernate, and GitHub; Spring Boot optional








