11+ UCCA Jobs in Chennai | UCCA Job openings in Chennai
- This role includes both product and project management. It involves data collection, filtering, refining, and sizing of large incident data sets.
- Preferred domain: Telecom
- Good to have: Voice, Unified Communication, Contact Center, Messaging, and Application-to-Person (A2P) experience (e.g., integration with WhatsApp)
- Work closely with the engineering team, the customer business team, and other cross-functional teams to realize an RPA implementation roadmap, from requirement gathering and ideation through solution design and implementation
- Create PDDs (Process Design Documents) and review SDDs (Solution Design Documents); handle POC development and solutioning, sizing, business case, and pricing activities
- Work closely with the marketing team on proposal preparation and business case development
Job Responsibilities:
- Builds business by identifying and selling prospects; maintaining relationships with clients.
- Identifies business opportunities by identifying prospects and evaluating their position in the industry; researching and analyzing sales options.
- Sells products by establishing contact and developing relationships with prospects; recommending solutions.
- Maintains relationships with clients by providing support, information, and guidance; researching and recommending new opportunities; recommending profit and service improvements.
- Identifies product improvements or new products by remaining current on industry trends, market activities, and competitors.
- Prepares reports by collecting, analyzing, and summarizing information.
- Maintains quality service by establishing and enforcing organization standards.
- Maintains professional and technical knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.
- Contributes to team effort by accomplishing related results as needed.
Working Conditions:
- Sales engineers will drive sales growth from their respective zones (Bangalore or Chennai).
- Work timings are 9:30 AM to 7:30 PM, Monday to Friday, with suitable breaks for lunch and snacks.
- Travel to prospective clients for demos or trials may be required.
Fintech Based Product & Service Organization
One of our premium clients is hosting a drive for experienced React Native developers on Saturday, February 11. If you're passionate about building cutting-edge mobile apps and want to join a fast-growing tech company, this is the opportunity for you!
Shortlisted candidates will be scheduled for the weekend drive.
What they're looking for:
1. Experience with React Native, Redux, HOC, Hooks, Native Bridging, React Navigation
2. Experience uploading applications to the Play Store and App Store
3. Good problem-solving skills
4. Object-oriented concepts
5. Experience working with RESTful APIs and WebSockets
6. Knowledge of source code version control: Git, Subversion
- A Natural Language Processing (NLP) expert with strong computer science fundamentals and experience working with deep learning frameworks. You will be working at the cutting edge of NLP and Machine Learning.
Roles and Responsibilities
- Work as part of a distributed team to research, build and deploy Machine Learning models for NLP.
- Mentor and coach other team members
- Evaluate the performance of NLP models and ideate on how they can be improved (see the evaluation sketch after this list)
- Support internal and external NLP-facing APIs
- Keep up to date on current research around NLP, Machine Learning and Deep Learning
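For illustration only, the model-evaluation work mentioned above might look like the following minimal Python sketch using scikit-learn metrics; the labels are placeholders and the library choice is an assumption, not something stated in the listing.

```python
# A minimal sketch of evaluating an NLP classification model with
# scikit-learn; y_true / y_pred are placeholder labels, not project data.
from sklearn.metrics import accuracy_score, f1_score, classification_report

y_true = ["positive", "negative", "positive", "neutral"]   # gold labels (placeholder)
y_pred = ["positive", "negative", "neutral", "neutral"]    # model predictions (placeholder)

print("accuracy:", accuracy_score(y_true, y_pred))
print("macro F1:", f1_score(y_true, y_pred, average="macro"))
print(classification_report(y_true, y_pred))
```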
Mandatory Requirements
- Any graduate degree with at least 2 years of demonstrated experience as a Data Scientist.
Behavioral Skills
- Strong analytical and problem-solving capabilities.
- Proven ability to multi-task and deliver results within tight time frames
- Must have strong verbal and written communication skills
- Strong listening skills and eagerness to learn
- Strong attention to detail and the ability to work efficiently in a team as well as individually
Technical Skills
Hands-on experience with
- NLP
- Deep Learning
- Machine Learning
- Python
- BERT (see the usage sketch after this list)
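Purely as an illustrative sketch of the hands-on skills above, this is how a pre-trained BERT model might be loaded for sentence classification with the Hugging Face transformers library; the library, model name, and example sentence are assumptions (the listing only names Python and BERT).

```python
# A minimal sketch: load a pre-trained BERT model for sequence classification.
# Note: the classification head below is freshly initialised and must be
# fine-tuned on task data before its outputs are meaningful.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("The network outage was resolved quickly.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))   # class probabilities (untrained head, illustrative only)
```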
Preferred Requirements
- Experience in Computer Vision
Responsibilities
- Identify and Source candidates with matching skills and experience as specified in the requirements
- Interview and qualify shortlisted candidates to confirm suitability (technical, administrative and personal), attitude, honesty and reliability
- Confirm that the chosen candidates are willing to be deputed to work at our clients’ premises
- Build confidence and trust with the candidates that they are part of our team as full-time employees
- Follow up with candidates, promptly and periodically sharing and obtaining all relevant information throughout the recruitment process
Location: Chennai
Experience: 3+ years
Domain: Domestic Staffing Industry
Skills: Recruitment of IT and other engineers for Telecom Industry
Geographical Coverage: Pan India
Work Hours: 6-day week, with the second Saturday off
Job Title: Java Developer
Experience: 1 to 5 years
Location: Chennai
Job Description:
- Design and development of Java applications using Java EE, Spring Boot, and databases
- Experience in designing, analyzing, coding and troubleshooting large-scale distributed systems
- Ensuring continuous professional self-development
- Strong core Java skills - Multithreading, Collections, Concurrent programming
- Should have knowledge of OOP, design patterns, and data structures
- Should have a strong understanding of databases and their core concepts, including stored procedures
- Ability to work in a team and handle production environments and application maintenance.
at Thillais Analytical Solutions Private Limited
Company Description
At Bungee Tech, we help retailers and brands meet customers everywhere and on every occasion they are in. We believe that accurate, high-quality data matched with compelling market insights empowers retailers and brands to keep their customers at the center of all the innovation and value they deliver.
We provide retailers and brands with a clear and complete omnichannel picture of their competitive landscape. We collect billions of data points from publicly available sources every day, often multiple times a day. Using high-quality extraction, we uncover detailed information on products or services, which we automatically match and then proactively track for price, promotion, and availability. Plus, anything we do not match helps to identify a new assortment opportunity.
Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
You will also be responsible for integrating them with the architecture used in the company.
We're working on the future. If you are seeking an environment where you can drive innovation, apply state-of-the-art software technologies to solve real-world problems, and have the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.
Responsibilities
As an experienced member of the team, in this role, you will:
- Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development
- You will research, design and code, troubleshoot and support. What you create is also what you own.
- Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces.
- Be able to broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation.
BASIC QUALIFICATIONS
- Bachelor’s degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar.
- 5+ years of relevant professional experience in Data Engineering and Business Intelligence
- 5+ years with advanced SQL (analytical functions), ETL, and data warehousing.
- Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ ELT and reporting/analytic tools and environments, data structures, data modeling and performance tuning.
- Ability to effectively communicate with both business and technical teams.
- Excellent coding skills in Java, Python, C++, or equivalent object-oriented programming language
- Understanding of relational and non-relational databases and basic SQL
- Proficiency with at least one of these scripting languages: Perl / Python / Ruby / shell script
PREFERRED QUALIFICATIONS
- Experience with building data pipelines from application databases.
- Experience with AWS services - S3, Redshift, Spectrum, EMR, Glue, Athena, ELK etc.
- Experience working with Data Lakes.
- Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
- Sharp problem-solving skills and the ability to resolve ambiguous requirements
- Experience working with Big Data
- Knowledge of and experience working with Hive and the Hadoop ecosystem
- Knowledge of Spark (a brief PySpark sketch follows this list)
- Experience working with Data Science teams
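By way of illustration only, the analytical SQL and Spark experience listed above might translate into a minimal PySpark sketch like the one below; the dataset path and column names are hypothetical and not taken from the listing.

```python
# A minimal PySpark sketch of an analytical (window-function) query over a
# hypothetical daily-prices dataset; path and columns are assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("price-trends").getOrCreate()

prices = spark.read.parquet("s3://example-bucket/daily_prices/")  # hypothetical source

w = Window.partitionBy("product_id").orderBy("price_date")
trend = (
    prices
    .withColumn("prev_price", F.lag("price").over(w))            # previous day's price
    .withColumn("price_change", F.col("price") - F.col("prev_price"))
)
trend.show(10)
```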
About OpsCruise
Digital business is driving a fundamental shift to cloud-native applications, creating a new set of operational and performance challenges ill-suited to the currently available solutions. At OpsCruise, we imagine a world of autonomous operations and are innovating a fundamentally different approach to performance management. OpsCruise’s vision is to automate the performance assurance of cloud applications using a model-driven closed-loop platform.
Team
The OpsCruise team represents a global and talented team that includes domain experts in IT Operations, Networking, Storage, Hyperscale Systems and AI/ML that have built market-leading solutions at companies such as Cisco, Google, Hitachi, HP, Infoblox, Oracle and VMWare among others.
Our engineering culture values creativity, pragmatism, honesty, and simplicity to solve hard problems the right way.
Role
We are looking for a Senior QA Engineer who will join our team building and rolling out our SaaS platform in the cloud AI/Ops space.
Our Technology Stack
Our product involves the following technology areas with one or more tools in use in each area:
- Container technologies, including creating Docker plugins and extensions
- Serverless technologies including instrumentation, addons
- Orchestrators including Kubernetes, OpenShift, Mesos, Swarm
- Metric generation and collection including Prometheus and tools such as Dynatrace and Datadog (see the sketch after this list)
- Tracing including OpenTracing, Jaeger
- Graph tools and Databases including neo4j, JanusGraph, TinkerPop/Gremlin
- TimeSeries databases such as Prometheus, OpenTSDB
- NoSQL and Indexing tools such as MongoDB, Cassandra, Solr and Elastic
- Languages including Java, Scala, Javascript, Python, R, and Go
- Messaging tools including Kafka, Akka
- Big Data tools including HDFS, YARN, Spark, Flink
- AI/ML techniques including Statistical Analysis, Classification, Deep Learning, etc.
- Cloud services: AWS, GCP, and Azure, their services in databases, networking and ML tools
- High performance User Interfaces including AngularJS, Vue, D3.js and local stores
- Authentication and Authorization including tools such as Okta and KeyCloak.
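As a small illustrative sketch of one item in this stack (metric generation and collection with Prometheus), the Python prometheus_client library can expose custom application metrics as shown below; the metric names and port are assumptions, not part of the product description.

```python
# A minimal sketch: expose custom metrics for Prometheus to scrape.
import time
from prometheus_client import Counter, Gauge, start_http_server

REQUESTS_TOTAL = Counter("app_requests_total", "Total requests handled")
INFLIGHT = Gauge("app_inflight_requests", "Requests currently in flight")

if __name__ == "__main__":
    start_http_server(8000)          # metrics served at http://localhost:8000/metrics
    while True:
        with INFLIGHT.track_inprogress():
            REQUESTS_TOTAL.inc()     # simulate handling one request
            time.sleep(1)
```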
Responsibilities
- Understand user level requirements, write up the test strategy, derive test plan and test cases in detail.
- System and integration testing with automation of test cases using Python or Java (a minimal Python example follows this list).
- Use test frameworks such as Robot Framework.
- Use traffic generator tools such as JMeter for performance testing
- Set up the target test environments in AWS, Azure or GCP
- Containers based environment setup with Kubernetes and Docker, monitoring tools setup with Prometheus
- Debug incidents and issues to narrow them down to a root cause
- Understand and internally reproduce issues reported by customers
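A minimal sketch of what the Python-based test automation referenced above could look like, using pytest and requests; the base URL, endpoint, and expected payload are hypothetical and only illustrate the style of test.

```python
# A minimal integration-test sketch with pytest and requests.
# BASE_URL and the /health contract are hypothetical placeholders.
import requests

BASE_URL = "https://staging.example.com"   # hypothetical test environment

def test_health_endpoint_returns_ok():
    resp = requests.get(f"{BASE_URL}/api/v1/health", timeout=10)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"
```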
Qualifications
The ideal candidate must have the following qualifications.
- B.E/B.Tech Degree from a reputed institution with at least 4 years of relevant experience.
- Hands-on experience with test automation using Python or Java.
- Experience with test frameworks such as Robot Framework or TestNG.
- Experience using traffic generation tools such as curl-loader or JMeter
- QA engineers with experience testing networking technology products such as switches, routers, and L4-L7 products can also apply.
- Knowledge of public clouds, AWS, Azure or GCP, is desired
- Working knowledge in Kubernetes, Docker or Openshift environments required
- Hands on experience with Linux
- Strong problem solving and debugging abilities
- Familiarity with continuous integration tools such as Jenkins or CircleCI
- Interest in machine learning (ML) and data science is a plus
Most importantly, you should be someone who is passionate about building new and innovative products that solve tough real-world problems.
LOCATION
Chennai, India
- Looking only for candidates who can join immediately or within 15 days.
- Looking for an experienced Integration Specialist with good expertise in ETL (Informatica) and a strong application integration background
- At least 3 years of relevant experience in Informatica MDM is required; PowerCenter is a core skill set.
- Having experience in a broader Informatica toolset is strongly preferred
- Should have very strong implementation experience in application integration and be able to demonstrate expertise through multiple use cases
- Passionate coders with a strong application development background; experience may range from 5 to 15+ years
- Should have application development experience outside of ETL (knowing ETL as a tool alone is not enough); experience writing applications outside of ETL will bring in more value
- Strong database skills, with a solid understanding of data, data quality, and data governance, and experience developing standalone and integrated database layers (SQL, packages, functions, performance tuning); in short, an integration expert with more application integration background than just the ETL Informatica tool
- Experience in XML/JSON-based integration, heavily involving JMS MQ (read/write)
- Experience with SOAP- and REST-based APIs exchanging both XML and JSON for request and response (a brief Python sketch follows this list)
- Experience with Salesforce.com integration using the Informatica PowerExchange module is a plus but not required
- Experience with Informatica MDM as a technology stack used for integrating senior-market members with Salesforce.com is a plus but not required
- Very strong scripting background (C, Bourne shell, Perl, Java)
- Should be able to understand Java; we do have development around Java, i.e., the ability to work out a solution in a programming language like Java when implementation is not possible through ETL
- Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
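Purely as an illustration of the REST/JSON integration experience asked for above, here is a minimal Python requests sketch; the endpoint, payload fields, and token are hypothetical, and the role's actual tooling centres on Informatica, JMS MQ, and shell/Perl/Java scripting.

```python
# A minimal sketch of a REST call exchanging JSON; URL, payload, and the
# bearer token below are placeholders, not details from the listing.
import requests

url = "https://api.example.com/v1/members"          # hypothetical endpoint
payload = {"memberId": "12345", "status": "ACTIVE"} # hypothetical request body
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer <token>",              # placeholder credential
}

resp = requests.post(url, json=payload, headers=headers, timeout=30)
resp.raise_for_status()                             # fail fast on HTTP errors
print(resp.json())                                  # parsed JSON response
```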