Location: Pune
Notice Period: Immediate to 15 days
JD
- Expertise Area – CMS is a must
- Preference – Prior SBIC working experience

Qualifications:
- Proven experience in sales, preferably in a branding, marketing, or creative agency environment; ideally 2-4 years.
- Strong understanding of branding principles, design, and marketing strategies.
- Excellent communication and presentation skills.
- Results-driven with a track record of meeting or exceeding sales targets.
- Ability to build and maintain strong client relationships.
- Bachelor's degree in Business, Marketing, or a related field is preferred.
If interested, kindly share your updated resume at 82008 31681.
About Tibco
Headquartered in Palo Alto, CA, TIBCO Software enables businesses to reach new heights on their path to digital distinction and innovation. From systems to devices and people, we interconnect everything, capture data in real time wherever it is, and augment the intelligence of organizations through analytical insights. Thousands of customers around the globe rely on us to build compelling experiences, energize operations, and propel innovation. Our teams flourish on new ideas and welcome individuals who thrive in transforming challenges
into opportunities. From designing and building amazing products to providing excellent service, we encourage and are shaped by bold thinkers, problem-solvers, and self-starters. We are always adapting and providing exciting opportunities for our employees to grow, learn, and excel.
We value the customers and employees that define who we are; dynamic individuals willing to take the risks necessary to make big ideas come to life and who are comfortable collaborating in our creative, optimistic environment. TIBCO – we are just scratching the surface.
Who You’ll Work With
TIBCO Data Virtualization (TDV) is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources, delivering data sets and IT-curated data services to any analytics solution. TDV is a Java-based, enterprise-grade database engine supporting all phases of data virtualization development, run-time, and management. It is the trusted solution of choice for top enterprises in verticals such as finance, energy, pharmaceuticals, retail, and telecom. Are you interested in working on leading-edge technologies? Are you fascinated by Big Data, Cloud, Federation, and Data Pipelines? If you have built software frameworks and have a background in Data Technologies, Application Servers, Business Intelligence, etc., this opportunity is for you.
Overview
The TIBCO Data Virtualization team is looking for an engineer with experience in SQL data access using JDBC, web services, and native client access for both relational and non-relational sources. You will have expertise in developing a metadata layer around disparate data sources and implementing a query runtime engine for data access, including plugin management. The core responsibilities will include designing, implementing, and maintaining the subsystem that abstracts data and metadata access across different relational database flavors, Big Data sources, cloud applications, and enterprise application packages such as SAP R/3, SAP BW, and Salesforce. The server is implemented by a multi-million-line source base in Java, so the ability to understand and integrate with existing code is an absolute must. The core runtime is a complex multi-threaded system, and the successful candidate will demonstrate complete expertise in handling features geared towards concurrent transactions in a low-latency, high-throughput, and scalable server environment. The candidate will have the opportunity to work in a collaborative environment with leading database experts in building the most robust, scalable, and high-performing database server.
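The data-source abstraction described above (a shared metadata and query contract, with pluggable per-source adapters managed by a registry) can be sketched roughly as follows. This is a hypothetical illustration in Python for brevity, not the actual Java implementation; all class and method names are invented:

```python
from abc import ABC, abstractmethod

class DataSourceAdapter(ABC):
    """Hypothetical adapter contract: every source (relational, Big Data,
    cloud application) exposes the same metadata and query interface."""

    @abstractmethod
    def introspect(self) -> dict:
        """Return table -> column-name metadata for the source."""

    @abstractmethod
    def execute(self, query: str) -> list:
        """Run a (possibly dialect-translated) query and return rows."""

class InMemoryAdapter(DataSourceAdapter):
    """Toy adapter backed by a dict, standing in for a real JDBC driver."""

    def __init__(self, tables: dict):
        self.tables = tables

    def introspect(self) -> dict:
        return {name: sorted(rows[0]) if rows else []
                for name, rows in self.tables.items()}

    def execute(self, query: str) -> list:
        # A real adapter would translate SQL into the source's dialect;
        # this toy only supports "SELECT * FROM <table>".
        table = query.rsplit(" ", 1)[-1]
        return self.tables[table]

class AdapterRegistry:
    """Plugin manager: maps source names to registered adapter instances."""

    def __init__(self):
        self._adapters = {}

    def register(self, name: str, adapter: DataSourceAdapter) -> None:
        self._adapters[name] = adapter

    def query(self, name: str, sql: str) -> list:
        return self._adapters[name].execute(sql)
```

A caller never sees which backend serves a query; it only talks to the registry, which is the essence of the abstraction layer the role maintains.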
Job Responsibilities
In this crucial role as a Data Source Engineer, you will:
• Drive enhancements to existing data-source layer capabilities
• Understand and interface with 3rd party JDBC drivers
• Ensure all security-related aspects of driver operation function with zero defects
• Diagnose customer issues and perform bug fixes
• Suggest and implement performance optimizations
Required Skills
• Bachelor’s degree with 3+ years of experience, or equivalent work experience.
• 3+ years programming experience
• 2+ years of Java based server side experience
• 1+ years of experience with at least one of JDBC, ODBC, SOAP, REST, and OData
• 1+ years of multithreading experience
• Proficiency in both spoken and written communication in English is a must
Desired Skills
• Strong object-oriented design background
• Strong SQL & database background
• Experience developing or configuring cloud-based software
• Experience with all lifecycle aspects of enterprise software
• Experience working with large, pre-existing code bases
• Experience with enterprise security technologies
• Experience with any of the following types of data sources: Relational, Big Data, Cloud, Data Lakes, and Enterprise Applications.
• Experience using Hive, Hadoop, Impala, Cloudera, and other Big Data technologies
Experience: 2 to 8 Yrs.
Primary Skills:
Very good knowledge of C#.NET and SQL
Hands-on experience writing unit test cases
Experience with application packaging, and building and deploying applications
Secondary Skills (Not Mandatory):
Knowledge of Microsoft Excel, PL/SQL, MySQL, WPF, and MVVM
Knowledge of developing Excel plugins using Excel Interop
Minimum 3 years of experience in developing desktop applications using WPF.
Highly skilled in object-oriented design, enterprise patterns, application tiers and layers, and effective coding practices
Experience with Microsoft Office automation tools such as Word/Excel Interop
Responsibilities:
- Write and maintain the modules of Magento 2.X EE websites;
- Write clean, modular, robust code to implement the desired requirements with little or no supervision;
- Work with the QA and Customer Support teams to triage and fix bugs with rapid turnaround;
- Contribute ideas for making the application better and easier to use;
- Create reusable components, which can be configured for different projects;
- Create test plans and perform thorough quality analysis on the code before go-live;
- Research new integration and plug-in capabilities.
WE ARE GRAPHENE
Graphene is an award-winning AI company, developing customized insights and data solutions for corporate clients. With a focus on healthcare, consumer goods and financial services, our proprietary AI platform is disrupting market research with an approach that allows us to get into the mind of customers to a degree unprecedented in traditional market research.
Graphene was founded by corporate leaders from Microsoft and P&G and works closely with the Singapore Government & universities in creating cutting edge technology. We are gaining traction with many Fortune 500 companies globally.
Graphene has a 6-year track record of delivering financially sustainable growth and is one of the few start-ups that are self-funded yet profitable and debt-free.
We already have a strong bench strength of leaders in place. Now, we are looking to groom more talent for our expansion into the US. Join us and take both your growth and ours to the next level!
WHAT WILL THE Full Stack Engineer DO?
- Primary Purpose: As part of a highly productive and creative AI (NLP) analytics team, design and develop web applications and SPAs, i.e., the UI/UX and the underlying backend: APIs, security framework, scalable microservices, etc.
- Daily Grind: Interface with the product manager, project managers, and the engineering team to achieve sprint goals on the product roadmap.
- Career Progression: Senior Full Stack Engineer, Technical Architect
YOU CAN EXPECT TO
- Work in a product-development team capable of independently authoring software products.
- Guide junior programmers, set up the architecture, and follow modular development approaches.
- Design and develop code which is well documented.
- Optimize the application for maximum speed and scalability.
- Adhere to information security and DevOps best practices.
- Research and develop new approaches to problems.
- Design and implement schemas and databases for the AI application.
- Cross-pollinate with other teams.
HARD AND SOFT SKILLS
Must Have
- Problem-solving abilities
- Extremely strong programming background – data structures and algorithms
- Angular/React, Strong UI/UX skills
- Very strong Python background; TDD, CI/CD
- Software design skills, i.e., OOP, design patterns
- SQL, NoSQL – design of schemas and databases
- Microservices architecture, Cloud native apps
Good to Have
- Enterprise Design Patterns
- Distributed Systems
- Exposure to ML and Data Science
- Docker, Kubernetes, AKS, Kafka, Graph databases
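The schema-design skill listed above ("SQL, NoSQL – design of schemas and databases") can be illustrated with a minimal sketch using Python's built-in sqlite3; the table, columns, and data here are invented for the example:

```python
import sqlite3

# In-memory database standing in for a real service's store.
conn = sqlite3.connect(":memory:")

# A constrained schema: NOT NULL and CHECK encode invariants at the
# database layer rather than only in application code.
conn.execute("""
    CREATE TABLE documents (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        sentiment REAL CHECK (sentiment BETWEEN -1.0 AND 1.0)
    )
""")
# An index chosen for the expected access pattern (filtering by score).
conn.execute("CREATE INDEX idx_documents_sentiment ON documents (sentiment)")

conn.execute(
    "INSERT INTO documents (title, sentiment) VALUES (?, ?)",
    ("q1 report", 0.4),
)
conn.commit()

rows = conn.execute(
    "SELECT title FROM documents WHERE sentiment > 0"
).fetchall()
```

The same thinking (constraints near the data, indexes matched to queries) carries over to larger engines and to document stores, where the "schema" lives in validation rules instead of DDL.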
● Prepare Job Descriptions
● Work on all open positions
● Learn Sourcing through Naukri
● Evaluate CVs for the role
● Conduct the telephonic interviews
● Conduct Candidate Assessments
● Coordinate Hiring Process for shortlisted candidates
● Coordinate the Offer to Joining process
● Arrange Campus Placement Drives
● Understand Staffing & Placement Agencies Onboarding
Role and Responsibilities
- Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
- Build robust RESTful APIs that serve data and insights to DataWeave and other products
- Design user interaction workflows on our products and integrate them with data APIs
- Help stabilize and scale our existing systems. Help design the next generation systems.
- Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
- Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
- Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
- Constantly think scale, think automation. Measure everything. Optimize proactively.
- Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
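The serving-layer and API responsibilities above can be sketched with a minimal JSON-over-HTTP endpoint using only the Python standard library; the endpoint path and metric names are hypothetical, and a production service would sit behind a real framework and load balancer:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory metrics, standing in for the analytics pipeline's output.
METRICS = {"products_tracked": 12000, "sites_crawled": 35}

class MetricsHandler(BaseHTTPRequestHandler):
    """Minimal read-only endpoint: GET /metrics returns a JSON document."""

    def do_GET(self):
        if self.path == "/metrics":
            body = json.dumps(METRICS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        # Silence per-request logging for the example.
        pass

def serve(port: int = 0) -> HTTPServer:
    """Start the server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), MetricsHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Usage: `server = serve()` then fetch `http://127.0.0.1:{server.server_port}/metrics`; the daemon thread keeps request handling off the caller's thread, a toy version of the low-latency serving concern.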
Skills and Requirements
- 8 to 15 years of experience building and scaling APIs and web applications.
- Experience building and managing large scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, and Elasticsearch.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, and Datadog.
- Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
- Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
- It's a huge bonus if you have some personal projects (including open source contributions) that you work on in your spare time. Show off some of the projects you have hosted on GitHub.







