11+ VCL Jobs in Bangalore (Bengaluru) | VCL Job openings in Bangalore (Bengaluru)
Apply to 11+ VCL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest VCL Job opportunities across top companies like Google, Amazon & Adobe.
- The developer should be familiar with VCL
- Must have knowledge of LiveBindings
- Nice to know:
- ADO
- BDE
- dbExpress
- DFS
- FastReport
- Gnostice
- Indy
- QuickReport
- Raize
- TntUnicode
- Familiar with GIT version control
- Good communication skills
Job Description
We are looking for a talented Java Developer for an overseas role. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience as a Java Developer or similar role.
Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).
Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
Familiarity with RESTful APIs and web services.
Understanding of version control systems (e.g., Git).
Solid understanding of object-oriented programming (OOP) principles.
Strong problem-solving skills and attention to detail.
Experience: 4-10 years
Location: Bangalore
Work Model: Hybrid
Job Description
- Experience in Core Java 5.0 and above, CXF, Spring.
- Extensive experience in developing enterprise-scale n-tier applications for financial domain. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS, preferably Sybase database.
- Good knowledge of multi-threading and high-volume server-side development.
- Experience in sales and trading platforms in investment banking/capital markets.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills in Java.
- Strong interpersonal, communication and analytical skills.
- Should be able to clearly articulate design ideas and decisions.
- Knowledge of RIL automation tools: Jio Track, Tableau
- Preventive and corrective maintenance of Small Cell, WiFi, UBR (Air Fiber backhaul), L2 Switch, OSP / common infrastructure
- Preventive maintenance of:
  - IBS (Indoor Small Cell / WiFi / UBR (P2P/P2MP)): battery & SMPS, electrical panel, energy meter readings, earthing & power system, alarm extension to NOC, security alarm system testing
  - Small Cell / L2 Switch / UBR (all types) / WiFi: power/CPRI cables, GPS connector, antenna & LOS alignment, etc.
  - SMPS: rectifier ventilation cleaning, LVD/BLVD setting checks, verifying all parameters are visible to the NOC, hardware replacement (rectifier, controller, communication modules)
  - Battery: battery discharge test, voltage & abnormal-temperature checks, earthing & power connections
  - Earthing: earth pit measurement and maintenance, recording voltage between neutral & earth, checking earthing bonding at all points at the site
  - Overall hygiene of the site
- Replacing faulty cards with healthy spares at the site; bringing faulty modules from the site to the CMP
- Providing data for consumption booking in SAP; related tasks as instructed by NOC engineers for Small Cell, WiFi, IP, SMPS, battery, earthing & UBR maintenance
- Should be well versed in basic Microsoft Office
- Big data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingesting and aggregating different events and loading consumer response data into Hive external tables in HDFS to serve as feeds for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and scaling of clusters in the AWS cloud.
- Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad-hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and of Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and in their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Note: Candidates from Premium Institutions are preferred
Job Description
- Should be proficient with core object oriented JavaScript
- Ability to build full stack application using React/Node/Mongo or any other DB
- React Js
- Redux
- Bootstrap & Responsive design concept with HTML5/CSS3 ideas
- npm/nvm concept and hands on experience on usage
- Git as version control tool
- Clear idea of REST API and ability to read API docs and integrate independently
About Senrya Technologies Pvt.Ltd. –
We are a full-stack avant-garde product organization with its roots in East India (Kolkata) and a presence in the North East, Maharashtra, Gurgaon and Karnataka, and we have helped turn Kolkata from an IT backwater into a live-wire hub of innovation. We blend innovation and next-gen technologies with more than a decade of rich experience and undaunted ambition, with the primary objective of building highly scalable solutions that will impact a billion lives.
Senrysa group of companies includes three business verticals that contribute to our success story - Fintech, Retailtech, Healthtech. The growth trajectory of Senrysa has been 100% every passing year.
Fintech - We are one of the leading fintech organizations in financial inclusion, with innovative ICT solutions. Senrysa Technologies was an early adopter of the India Stack and pioneered the Aadhaar Enabled Payment System under the flagship business correspondent model for the masses, acquiring more than 30 million customers to date under this initiative. We take pride in being granted RuPay certification by NPCI. Senrysa has a diversified client base across India, from Public Sector Banks, Private Banks and Regional Rural Banks to Co-operative Banks.
Retailtech - NDHGO is an AI-enabled B2B online ordering platform that empowers small and medium businesses to go online in less than a minute, offering unorganized retail players a level playing field to compete with organized aggregators. It also enables them to sell and market their product offerings online with zero commission. We aim to develop a sustainable online ecosystem for these stores, make their presence digitally discernible, and accelerate customer engagement with a mobile-first approach and simplicity of adoption: retailers can list their products from the NDHGO Catalogue Builder and share the link to their store with customers through WhatsApp and trending social media. Please refer to the attached presentation for a quick overview.
Healthtech
Understanding the dire need for new practices and procedures, and to ensure that quality and timely healthcare reaches every corner of India, we have entered healthcare. We are developing advanced IoT-based telemedicine devices that will usher in smart health practices to improve quality, increase access and improve affordability.
Benefits:
Competitive Package
ESOPs
Retention Bonus
Joining Bonus
Relocation Bonus
Highly skilled team members with vast knowledge & expertise
2. Making international calls to top brands across the globe
3. Sending sales proposals to clients and closing sales.
About the Role
A highly motivated and passionate individual with experience in executing end-to-end web-based products and applications, bringing them to production quickly and with high quality. Passionate about building flexible and scalable solutions, with an eye for detail, and able to weigh pros and cons to find the best possible solutions.
Role and Responsibilities
- Collaborate with Product Managers to plan, implement and deliver tasks and modules on a timely basis with best practices and adherence to SOPs.
- Understand the product requirements, ask questions, and gather information and feedback to design and deliver features both on Android and iOS.
- Create a roadmap of tasks to be delivered for both iOS and Android applications in sync, such that new features go to the end-user at same time.
- Lead the design of Android and iOS applications in a modular fashion with reusable components.
- Proactively identify issues related to memory consumption, battery drain and multi-threading in the application by planning regular tests and analysis.
- Deliver Android and iOS applications with integration to backend services deployed on the cloud with high quality and responsiveness.
- Create strong practices around test driven development, automating delivery of apps to the marketplace with strong CI/CD practices.
- Manage a highly skilled and efficient team by hiring, keeping the team motivated, and managing performance.
- Play the role of an unblocker in a tight scrum environment. Should be able to help other developers with challenges, problem solving, and help achieve milestones as per plan.
- Prioritize to manage ad-hoc requests in parallel with ongoing projects.
- Hands-on programming, with 50%+ of bandwidth going into owned modules.
Skills/Experience
- A highly talented developer with 10+ years of hands-on experience building apps that have been released to the Play Store (Android) and the App Store (iOS).
- Demonstrated experience managing teams of 5-10 or more engineers in mobile application teams.
- Strong knowledge of Android, Kotlin, iOS, Swift, Objective C and working with MVVM, MVP, MVC patterns.
- Experience building mobile applications which lean heavily on connecting to cloud services to gather data and stream videos/content.
- Working knowledge of Xamarin or Flutter (cross platform frameworks) is highly preferred.
- Practitioner of test driven development practices. Experience in creating and adhering to best practices for development.



