
Guidewire Developer - PAN India
Role : Guidewire Developer
Location : PAN India
Years of experience : 3 - 15 years
Roles and Responsibilities :
- At least 3 years of experience as a Guidewire developer
- Experience in PolicyCenter (PC) or ClaimCenter (CC)
- Experience in configuration or integration
- Guidewire Specialist certification
Regards
Sundaravalli

Similar jobs
- Develop and implement a link-building strategy
- Strong written communication skills
- Research and analyze competitor backlinks
- Lead keyword research and content optimization
- Support other digital marketing activities such as On-Page SEO, Social Media Optimization (SMO), and Content Writing
- Knowledge of Guest Post Outreach
- Assist with blog content
- Should be result-oriented
What the company wants:
They want someone who can build and manage relationships with real estate brokers. You’ll be the main point of contact for brokers, onboard new ones into the network, and work with them to bring in buyers and properties. Your job is to make sure brokers stay engaged and motivated, and consistently deliver results.
Role:
- Be the single point of contact (POC) for all brokers.
- Onboard new brokers into the company’s channel partner network.
- Build and maintain strong, long-term broker relationships.
- Work with brokers to bring in qualified buyers for listed properties.
- Encourage brokers to source high-potential properties for the company.
- Track broker performance and give feedback for improvement.
- Conduct regular broker training to align them with company processes.
- Stay updated on market trends and identify new opportunities.
- Travel frequently to meet brokers and keep them engaged.
About CodersBrain
CodersBrain is a leading consultancy firm that offers offshore development services to clients all over the world. They provide end-to-end digital and business solutions, partnering with clients to simplify, strengthen, and transform their businesses. They have extensive experience working with enterprises, software companies, SMEs, and startups.
Position Overview
Job Title: Java FS Developer with React
Experience: 3 to 9 Years
Location: Hyderabad / Bangalore / Chennai
Notice Period: Immediate to 15 Days
Required Skills and Experience
- Java Full Stack Developer with 3-9 years of experience
- Strong experience with Java 8, including the Streams API
- Hands-on experience in React JS
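As a quick illustration of the Java 8 skills listed above, here is a minimal Streams sketch; the data and class name are illustrative only, not part of the posting:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamDemo {
    public static void main(String[] args) {
        // Hypothetical input data for illustration
        List<String> skills = Arrays.asList("Java", "React", "Spring", "JUnit");
        // Filter, transform, and collect using the Java 8 Streams API
        List<String> upper = skills.stream()
                .filter(s -> s.length() > 4)
                .map(String::toUpperCase)
                .collect(Collectors.toList());
        System.out.println(upper); // [REACT, SPRING, JUNIT]
    }
}
```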
Responsibilities:
Respond to customer queries in a timely and accurate way, via phone.
Identify customer needs and help customers use specific features.
Inform customers about new features and functionalities.
Gather customer feedback and share with our Product, Sales and Marketing teams.
Shift Timing: 1pm-9.30pm (UK)
8pm-5am (US)
** Must have excellent communication skills in English.
Freshers can also apply.
Interview Mode: Face to face
Contact: Sanjukta (HR) @ 9038169266 / sanjukta.seal@adretsoftware.com
Mandatory Requirements
- Strong MVC/MVVM frontend development experience in Angular, React, Vue, AngularJS, Redux, or similar frameworks
- Strong foundation skills in HTML, CSS, SASS, LESS, SCSS, JavaScript, object-oriented JavaScript (OOJS), and jQuery
- Responsive design techniques; highly proficient, with strong security and performance optimization techniques
- Experience building reusable components using modern techniques
- Experience building PWA and hybrid applications using Ionic, React Native, etc. is a plus
- If you don’t have experience with MVC/MVVM frameworks like Angular, we will still consider you if your fundamental skills (HTML, CSS, JS) are strong
Job Description:
- Proven experience coding in Java/J2EE.
- Experience developing multi-channel responsive web applications.
- Experience working with Windows and Unix/Linux operating system environments.
- Familiarity with common stacks.
- Experience/knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery).
- Excellent communication and teamwork skills.
- Willingness to travel within India or abroad for short-term or long-term assignments would be an added advantage.
Job Responsibilities:
* Design, build, and maintain efficient, reusable, and reliable Java code
* Ensure the best possible performance, quality, and responsiveness of the applications
* Identify bottlenecks and bugs, and devise solutions to these problems
* Help maintain code quality, organization, and automatization
* Prepare the technical design of complex technology components
Mandatory Skills:
* Proficient in Java (JDK 1.7 or above; JDK 8 preferred), with good knowledge of its ecosystem, a knack for writing clean, readable Java code, experience writing reusable Java libraries, and knowledge of concurrency patterns in Java
* Solid understanding of object-oriented programming along with various design and architectural patterns
* Hands-on experience with Spring, Spring Boot, and JUnit
* Familiarity with MVC, microservices, and RESTful API concepts
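The concurrency patterns mentioned above can be sketched with a minimal `ExecutorService` example; the tasks are hypothetical and not part of the job description:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrencyDemo {
    public static void main(String[] args) throws Exception {
        // Fixed-size thread pool: a common Java concurrency pattern
        ExecutorService pool = Executors.newFixedThreadPool(2);
        // Submit independent tasks and combine their results via Futures
        Future<Integer> a = pool.submit(() -> 20 + 1);
        Future<Integer> b = pool.submit(() -> 20 + 1);
        int total = a.get() + b.get(); // blocks until both tasks complete
        System.out.println(total); // 42
        pool.shutdown();
    }
}
```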
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. At least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
  ▪ Stream and batch big data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design and implementation
  ▪ Microservices architecture, API modeling, design, and programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, and the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive); extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems, including partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and for analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish between system-level and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance, ensuring the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API management, API gateways, service mesh, identity and access management, and data protection and encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms viz. Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, network, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
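As a small illustration of the partitioning concept named in the distributed-systems bullet above, here is a hash-partitioning sketch; the class, keys, and bucket count are hypothetical:

```java
import java.util.Arrays;
import java.util.List;

public class PartitionDemo {
    // Hash partitioning: deterministically route a key to one of
    // numPartitions buckets, as when bucketing data across nodes.
    // floorMod keeps the result non-negative even for negative hashes.
    static int partition(String key, int numPartitions) {
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        List<String> keys = Arrays.asList("order-1", "order-2", "order-3");
        for (String k : keys) {
            // Same key always lands in the same bucket
            System.out.println(k + " -> bucket " + partition(k, 4));
        }
    }
}
```

A real system layers replication and rebalancing (e.g. consistent hashing) on top of this basic scheme, but the routing idea is the same.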









