Qualification: B.E./B.Tech/MCA with 70% aggregate, graduating in 2020/2021
Salary: Rs. 2.4 LPA to start, revised half-yearly based on performance.
Skills: Programming with C++/Java/Swift/Angular/QA (any one)
Job Location : Marathahalli, Bangalore
Are you an IT professional with 1.5-3 years of experience and strong fundamentals in any programming language? If yes, apply now!
Position: Programmer Analyst
CTC: 8 LPA to 10 LPA
Interview Process:
1. Eligibility Check
2. Technical and HR Round of Interviews
3. A letter of intent (LoI) from the client to kick-start the Technical Grooming Session
4. Evaluation
5. Offer letter
About Us:
Merkle, a dentsu company, is a leading data-driven customer experience management (CXM) company that specializes in the delivery of unique, personalized customer experiences across platforms and devices. For more than 30 years, Fortune 1000 companies and leading nonprofit organizations have partnered with Merkle to maximize the value of their customer portfolios. The company’s heritage in data, technology, and analytics forms the foundation for its unmatched skills in understanding consumer insights that drive hyperpersonalized marketing strategies. Its combined strengths in consulting, creative, media, analytics, data, identity, CX/commerce, technology, and loyalty & promotions drive improved marketing results and competitive advantage. With more than 14,000 employees, Merkle is headquartered in Columbia, Maryland, with 50+ additional offices throughout the Americas, EMEA, and APAC.
Role Description:
Ensure successful project completion by understanding client expectations, overseeing operations on key projects, and coaching a team of 7-10. It's an opportunity to work with top-10 market research firms or large corporations, leverage Ugam's proven project management frameworks, and collaborate with some of the most experienced professionals in market research.
Key Responsibilities (AEM Developer):
• Experience implementing Adobe AEM (AEM 6.4 or higher)
• Design, develop and support AEM content fragments, integrations, and applications
• Experience with secure coding practices
• Experience implementing web content management systems in a large corporate environment.
• Experience with object-oriented design and design patterns
• Excellent estimation abilities in scoping work in a highly complex environment
• Experience with dispatcher, caching
• Experience with OSGI Bundles.
• Hands-on experience with Java/J2EE, Struts, Hibernate, JavaScript, HTML, CSS, CRXDE, JCR, Sling models, Sling servlets, OSGi configurations, AEM dispatcher configuration, workflows, content fragment models, dynamic templates, and component creation
• OSGi architecture, Sling servlets, and the Sling framework
• Strong experience in Adobe CQ5/AEM 6.x content management (CMS)
• Experience creating OSGi bundles, AEM Templates/Components, and Workflows
• Identifying and documenting functional and technical requirements.
• Strong Java/J2EE and OOP skills, with proficiency in design patterns
• Good understanding and working experience of Java frameworks (like Struts, Spring, etc.) and knowledge of CRX and Apache Sling.
• Should have experience in designing modules, templates and components.
• Exposure to more than one end-to-end implementation on AEM
Qualifications: AEM certification is an add-on
What are we looking for?
At least 3 years of experience in AEM. Knowledge of and experience with Sling Models, Java, Workflows, Sites, Assets, AEM Cloud, Servlets, OSGi configurations, content fragments, etc.
We thank you for sending us your profile and details for this role. If your profile is shortlisted for roles with us, we look forward to a conversation with you to learn more about what makes you awesome!
TOP 3 SKILLS
Python (Language)
Spark Framework
Spark Streaming
Docker/Jenkins/Spinnaker
AWS
Hive Queries
The candidate should be a good coder.
Preferred: Airflow
Must-have experience:
Python
Spark framework and streaming
Exposure to the Machine Learning lifecycle is mandatory.
Project:
This is a search-domain project. For any search activity happening on the website, this team builds the corresponding model: data scientists create the sorting/scoring models used for every search. The team works mostly on the streaming side of the data, so the candidate would work extensively with Spark Streaming, and there will be a lot of Machine Learning work.
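The kind of sorting/scoring model the posting describes can be illustrated with a minimal, framework-free sketch. This is a hypothetical plain-Python example (the function names and the term-overlap weighting are illustrative only); in the actual role this logic would run per micro-batch inside a Spark Streaming job.

```python
# Minimal sketch of a search-scoring model: rank documents for a query
# by query-term overlap. Hypothetical illustration; in production this
# kind of scoring runs inside Spark Streaming on the search event stream.

def score(query: str, doc: str) -> float:
    """Fraction of query terms that appear in the document."""
    q_terms = query.lower().split()
    d_terms = set(doc.lower().split())
    if not q_terms:
        return 0.0
    return sum(t in d_terms for t in q_terms) / len(q_terms)

def rank(query: str, docs: list[str]) -> list[str]:
    """Sort documents by descending score: a simple 'sorting/scored model'."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)

results = rank("red running shoes",
               ["blue casual shoes", "red running shoes sale", "red shirt"])
print(results[0])  # → red running shoes sale
```

A real model would replace the overlap score with features learned over click and conversion data, which is where the Machine Learning lifecycle work comes in.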
INTERVIEW INFORMATION
3-4 rounds.
1st round based on data engineering batching experience.
2nd round based on data engineering streaming experience.
3rd round based on the ML lifecycle (the 3rd round can be a techno-functional round based on previous feedback; otherwise, a 4th, functional round will be held if required).
Wolken Software provides a suite of AI-enabled, SaaS 2.0 cloud-native applications for Customer Service and Enterprise Solutions, namely Wolken Service Desk, Wolken IT Service Management, and Wolken HR Case Management. We have replaced incumbents like Salesforce, ServiceNow, Zendesk, etc. at various Fortune 500 and Fortune 1000 companies.
Job Description:
- 0-2 years of experience in a Product Support role.
- Need to understand the product and its features in order to demonstrate them to end users.
- Sound knowledge of Java and SQL.
- Need to analyze and resolve product-specific queries using MySQL.
- Consistently deliver on customer requirements.
- Engage with the development team and manage the progress of cases.
- Should be able to cope with a high-pressure work environment.
- The working model will be 24x7.
Must-haves:
- Good communication, both verbal and written
- Analytical skills
- Team player
Requirements:
- At least 2+ years of hands-on experience developing solutions on the Microsoft Azure platform
- 5+ years of industry experience designing and developing enterprise-scale services & platforms
- 5+ years of experience in C# with solid analytical, debugging, and problem-solving skills
- At least 1.5 years of hands-on experience with React JS/TypeScript/Bootstrap/Angular or related frameworks
- At least 3+ years of hands-on experience with relational databases (SQL Server/Azure SQL)
Overall 11+ years of experience, with at least 2+ years in architecture. Hands-on and able to go deep into the code if needed.
Mandatory Architecture & Design Skills: design patterns, scalability, security, resiliency, micro-services.
Mandatory Technology Skills: Kafka, Spring WebFlux, Spring Reactor, NoSQL (experience with at least one NoSQL database is a must), Java 8+, JWT, Spring Boot, JMeter, New Relic, JVisualVM, JVM tuning.
Desirable Technology Skills: Couchbase, AWS, Docker, JProfiler, ELK, HDFS, HBase, Kubernetes.
Job Responsibilities
Mentor the team and guide them in resolving issues
- Design
- Scalability testing
- Security
- Microservices
- Java 8
- JWT (JSON Web Tokens)
- JMeter
- New Relic
- Spring WebFlux
- Kafka
- NoSQL DB
- Spring Boot
Keyskills - Nice to Have
- Couchbase
- Amazon Web Service (AWS)
- Docker
- ELK
- Kubernetes
2. Delivering the well-documented and stable code within the assigned deadline
3. Working on and coding in Prolog (a logic-based language)
About Tibco
Headquartered in Palo Alto, CA, TIBCO Software enables businesses to reach new heights on their path to digital distinction and innovation. From systems to devices and people, we interconnect everything, capture data in real time wherever it is, and augment the intelligence of organizations through analytical insights. Thousands of customers around the globe rely on us to build compelling experiences, energize operations, and propel innovation. Our teams flourish on new ideas and welcome individuals who thrive in transforming challenges
into opportunities. From designing and building amazing products to providing excellent service, we encourage and are shaped by bold thinkers, problem-solvers, and self-starters. We are always adapting and providing exciting opportunities for our employees to grow, learn, and excel.
We value the customers and employees that define who we are; dynamic individuals willing to take the risks necessary to make big ideas come to life and who are comfortable collaborating in our creative, optimistic environment. TIBCO – we are just scratching the surface.
Who You’ll Work With
TIBCO Data Virtualization (TDV) is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources, delivering data sets and IT curated data services to any analytics solution. TDV is a Java based enterprise-grade database engine supporting all phases of data virtualization development, run-time, and management. It is the trusted solution of choice for the top enterprises in verticals like finance, energy, pharmaceutical, retail, telecom
etc. Are you interested in working on leading-edge technologies? Are you fascinated by Big Data, Cloud, Federation, and Data Pipelines? If you have built software frameworks and have a background in Data Technologies, Application Servers, Business Intelligence, etc., this opportunity is for you.
Overview
The TIBCO Data Virtualization team is looking for an engineer with experience in the area of SQL data access using JDBC, WebServices, and native client access for both relational and non-relational sources. You will have expertise in developing a metadata layer around disparate data sources and implementing a query runtime engine for data access, including plugin management. The core responsibilities will include designing, implementing, and maintaining the
subsystem that abstracts data and metadata access across different relational database flavors, BigData sources, Cloud applications, enterprise application packages like SAP R/3, SAP BW, Salesforce etc. The server is implemented by a multi-million line source base in Java, so the ability to understand and integrate with existing code is an absolute must. The core runtime is a complex multi-threaded system and the successful candidate will demonstrate complete expertise in handling features geared towards concurrent transactions in a low latency, high throughput and scalable server environment. The candidate will have the opportunity to work in a collaborative environment with leading database experts in building the most robust, scalable and high performing database server.
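The subsystem described above, one layer abstracting data and metadata access across many source types, can be sketched conceptually as an adapter registry: each source kind implements a common interface, and the engine dispatches through whichever adapter handles the request. This is a hypothetical plain-Python illustration of the pattern, not TDV's actual plugin API (which is Java-based).

```python
# Conceptual sketch of a data-source abstraction layer: every adapter
# exposes the same metadata/data interface, and the engine looks one up
# by source kind. Hypothetical illustration, not TDV's real plugin API.
from abc import ABC, abstractmethod

class DataSourceAdapter(ABC):
    @abstractmethod
    def columns(self, table: str) -> list[str]:
        """Metadata access: column names for a table."""

    @abstractmethod
    def fetch(self, table: str) -> list[dict]:
        """Data access: rows as dictionaries."""

class InMemoryAdapter(DataSourceAdapter):
    """Stand-in for a JDBC-, ODBC-, or REST-backed source."""
    def __init__(self, tables: dict):
        self._tables = tables

    def columns(self, table):
        rows = self._tables[table]
        return list(rows[0]) if rows else []

    def fetch(self, table):
        return self._tables[table]

REGISTRY: dict[str, DataSourceAdapter] = {}

def register(kind: str, adapter: DataSourceAdapter) -> None:
    REGISTRY[kind] = adapter

def query(kind: str, table: str) -> list[dict]:
    """The engine dispatches through the adapter for that source kind."""
    return REGISTRY[kind].fetch(table)

register("memory", InMemoryAdapter({"orders": [{"id": 1, "total": 9.5}]}))
print(query("memory", "orders"))  # → [{'id': 1, 'total': 9.5}]
```

In the real server, each "adapter" is a plugin wrapping a 3rd-party JDBC driver or native client, and the hard problems are the ones the posting names: concurrency, low latency, and zero-defect security behavior.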
Job Responsibilities
In this crucial role as a Data Source Engineer, you will:
• Drive enhancements to existing data-source layer capabilities
• Understand and interface with 3rd party JDBC drivers
• Ensure all security-related aspects of driver operation function with zero defects
• Diagnose customer issues and perform bug fixes
• Suggest and implement performance optimizations
Required Skills
• Bachelor’s degree with 3+ years of experience, or equivalent work experience.
• 3+ years programming experience
• 2+ years of Java based server side experience
• 1+ years of experience with at least one of JDBC, ODBC, SOAP, REST, and OData
• 1+ years of multithreading experience
• Proficiency in both spoken and written communication in English is a must
Desired Skills
• Strong object-oriented design background
• Strong SQL & database background
• Experience developing or configuring cloud-based software
• Experience with all lifecycle aspects of enterprise software
• Experience working with large, pre-existing code bases
• Experience with enterprise security technologies
• Experience with any of the following types of data sources: Relational, Big Data, Cloud, Data Lakes, and Enterprise Applications.
• Experience using Hive, Hadoop, Impala, Cloudera, and other Big Data technologies