Purpose
· The Analytics and BI Head will collaborate closely with leaders across product, sales, and marketing to support and implement high-quality, data-driven decisions.
· The candidate will ensure data accuracy and consistent reporting by designing optimal processes and procedures for analytical managers and employees to create and follow.
· The candidate will manage the processes and people responsible for correct data reporting, modeling, and analysis.
Key Responsibilities
Responsibilities will include but will not be restricted to:
· Lead cross-functional projects involving advanced data modeling and analysis techniques and review insights that will guide strategic decisions and uncover optimization opportunities.
· Managing project budgets and financials, including forecasting, monitoring, and reporting.
· Providing clear and concise instructions to different teams to ensure quality delivery of analysis.
· Maintain dashboards and performance metrics that support key business decisions.
· Planning and executing strategies for completing projects on time.
· Determining the need for training and talent development.
· Recruit, train, develop and supervise managerial level employees.
· Examine, interpret, and report results of analytical initiatives to stakeholders in leadership, technology, sales, marketing, and product teams.
· Oversee the deliverable pipeline, including rationalization, de-duplication, and prioritization.
· Anticipate future demands of initiatives related to people, technology, budget and business across the departments and review solutions to meet these needs.
· Communicate results and business impacts of insight initiatives to stakeholders within and outside of the company.
Technical Requirements (what)
· Advanced knowledge of data analysis and modeling principles: KPI Tree creation, Reporting best practices, predictive analytics, statistical and ML based modeling techniques
· 20+ years of work experience in analytics, including 5+ years leading a team. Candidates from the insurance industry are preferred.
· Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data, drawing conclusions, and developing actionable recommendations for business units.
· Strong problem solving, quantitative and analytical abilities.
· Strong ability to plan and manage numerous processes, people, and projects simultaneously.
· The right candidate will also be proficient and experienced with the following tools/programs:
§ Experience working with big data environments: Teradata, Aster, Hadoop.
§ Experience with testing tools such as Adobe Test & Target.
§ Experience with data visualization tools: Tableau, Raw, chart.js.
§ Experience with Adobe Analytics and other analytics tools.
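To illustrate the KPI-tree creation mentioned above, here is a minimal Python sketch of one common decomposition, revenue broken into its driver metrics. All names and figures are hypothetical, for illustration only, not a prescription of this role's actual KPIs.

```python
# Minimal KPI-tree sketch: revenue decomposed into driver metrics.
# All metric names and figures are hypothetical.

def revenue_kpi_tree(visitors, conversion_rate, avg_order_value):
    """Decompose revenue as visitors x conversion rate x average order value."""
    orders = visitors * conversion_rate
    revenue = orders * avg_order_value
    return {
        "visitors": visitors,
        "conversion_rate": conversion_rate,
        "orders": orders,
        "avg_order_value": avg_order_value,
        "revenue": revenue,
    }

tree = revenue_kpi_tree(visitors=10_000, conversion_rate=0.02, avg_order_value=50.0)
print(tree["revenue"])  # 10_000 * 0.02 * 50.0 = 10000.0
```

Decomposing a headline KPI this way lets each team own one driver metric while the tree keeps the arithmetic between them explicit.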
Desired Personal Qualities or Behavior
· Strong leadership qualities, ability to communicate the vision crisply and drive collaboration and innovation.
· Must be a self-starter.
· Strong diligence, organizational skills, and the ability to manage multiple projects.
· Strong written and verbal communication skills, able to communicate effectively and in a professional manner with all levels of the company and outside vendors.
· Ability to work in a diverse team environment and effectively support the demanding needs of the company.
· Ability to work under pressure, meet deadlines with shifting priorities.
Skills and requirements
- Experience analyzing complex and varied data in a commercial or academic setting.
- Desire to solve new and complex problems every day.
- Excellent ability to communicate scientific results to both technical and non-technical team members.
Desirable
- A degree in a numerically focused discipline such as Maths, Physics, Chemistry, Engineering, or Biological Sciences.
- Hands-on experience with Python, PySpark, and SQL.
- Hands-on experience building end-to-end data pipelines.
- Hands-on experience with Azure Data Factory, Azure Databricks, and Data Lake is an added advantage.
- Experience with big data tools: Hadoop, Hive, Sqoop, Spark, Spark SQL.
- Experience with SQL or NoSQL databases for the purposes of data retrieval and management.
- Experience in data warehousing and business intelligence tools, techniques and technology, as well as experience in diving deep on data analysis or technical issues to come up with effective solutions.
- BS degree in math, statistics, computer science or equivalent technical field.
- Experience in data mining structured and unstructured data (SQL, ETL, data warehouse, Machine Learning etc.) in a business environment with large-scale, complex data sets.
- Proven ability to look at solutions in unconventional ways. Sees opportunities to innovate and can lead the way.
- Willing to learn and work on Data Science, ML, AI.
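The end-to-end data pipeline skills listed above can be sketched at toy scale with the standard-library `sqlite3` module; table and column names here are hypothetical, and a real pipeline would use the Spark/Azure stack the posting names rather than SQLite.

```python
import sqlite3

# Toy extract-transform-load pipeline using stdlib sqlite3.
# Table and column names are hypothetical, for illustration only.

def run_pipeline(raw_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
    # Extract/load: ingest raw rows.
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", raw_rows)
    # Transform: aggregate spend per user, dropping negative (invalid) amounts.
    conn.execute(
        """CREATE TABLE user_spend AS
           SELECT user_id, SUM(amount) AS total
           FROM raw_events
           WHERE amount >= 0
           GROUP BY user_id"""
    )
    result = dict(conn.execute("SELECT user_id, total FROM user_spend"))
    conn.close()
    return result

spend = run_pipeline([("a", 10.0), ("a", 5.0), ("b", -1.0), ("b", 3.0)])
print(spend)  # {'a': 15.0, 'b': 3.0}
```

The same extract, validate, aggregate, and publish shape scales up directly to PySpark DataFrames feeding a warehouse table.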
Job Location: Chennai
Job Summary
The Engineering team is seeking a Data Architect. As a Data Architect, you will drive a
Data Architecture strategy across various Data Lake platforms. You will help develop
reference architecture and roadmaps to build highly available, scalable and distributed
data platforms using cloud based solutions to process high volume, high velocity and
wide variety of structured and unstructured data. This role is also responsible for driving
innovation, prototyping, and recommending solutions. Above all, you will influence how
users interact with Condé Nast’s industry-leading journalism.
Primary Responsibilities
The Data Architect role requires:
• Demonstrated technology and personal leadership experience in architecting,
designing, and building highly scalable solutions and products.
• Enterprise scale expertise in data management best practices such as data integration,
data security, data warehousing, metadata management and data quality.
• Extensive knowledge and experience in architecting modern data integration
frameworks, highly scalable distributed systems using open source and emerging data
architecture designs/patterns.
• Experience building external cloud (e.g. GCP, AWS) data applications and capabilities is
highly desirable.
• Expert ability to evaluate, prototype and recommend data solutions and vendor
technologies and platforms.
• Proven experience in relational, NoSQL, ELT/ETL technologies and in-memory
databases.
• Experience with DevOps, Continuous Integration and Continuous Delivery technologies
is desirable.
• This role requires 15+ years of data solution architecture, design and development
delivery experience.
• Solid experience in Agile methodologies (Kanban and SCRUM)
Required Skills
• Very strong experience in building large-scale, high-performance data platforms.
• Passionate about technology and delivering solutions for difficult and intricate
problems. Current on relational and NoSQL databases on cloud.
• Proven leadership skills; demonstrated ability to mentor, influence, and partner with
cross-functional teams to deliver scalable, robust solutions.
• Mastery of relational database, NoSQL, ETL/ELT (such as Informatica, DataStage),
and data integration technologies.
• Experience in at least one object-oriented programming language (Java, Scala,
Python) and Spark.
• Creative view of markets and technologies combined with a passion to create the
future.
• Knowledge of cloud-based distributed/hybrid data-warehousing solutions and data
lakes is mandatory.
• Good understanding of emerging technologies and their applications.
• Understanding of code versioning tools such as GitHub, SVN, CVS etc.
• Understanding of Hadoop Architecture and Hive SQL
• Knowledge of at least one workflow orchestration tool
• Understanding of Agile framework and delivery
Preferred Skills:
● Experience in AWS and EMR would be a plus
● Exposure to workflow orchestration tools like Airflow would be a plus
● Exposure to at least one NoSQL database would be a plus
● Experience in Databricks along with PySpark/Spark SQL would be a plus
● Experience with the Digital Media and Publishing domain would be a plus
● Understanding of digital web events, ad streams, context models
About Condé Nast
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and social
platforms - in other words, a staggering amount of user data. Condé Nast made the right
move to invest heavily in understanding this data and formed a whole new Data team
entirely dedicated to data processing, engineering, analytics, and visualization. This team
helps drive engagement, fuel process innovation, further content enrichment, and increase
market revenue. The Data team aimed to create a company culture where data was the
common language and facilitate an environment where insights shared in real-time could
improve performance.
The Global Data team operates out of Los Angeles, New York, Chennai, and London. The
team at Condé Nast Chennai works extensively with data to amplify its brands' digital
capabilities and boost online revenue. We are broadly divided into four groups: Data
Intelligence, Data Engineering, Data Science, and Operations (including Product and
Marketing Ops and Client Services), along with Data Strategy and Monetization. The teams
build capabilities and products to create data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse
forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the
extraordinary. We are a media company for the future, with a remarkable past. We are
Condé Nast, and It Starts Here.
Job Description:
- Understanding of the depth and breadth of computer vision and deep learning algorithms.
- At least 2 years of experience in computer vision and/or deep learning for object detection and tracking, along with semantic or instance segmentation, in either the academic or industrial domain.
- Experience with machine/deep learning frameworks like TensorFlow, Keras, scikit-learn, and PyTorch.
- Experience in training models through GPU computing using NVIDIA CUDA or on the cloud.
- Ability to transform research articles into working solutions to solve real-world problems.
- Strong experience in using both basic and advanced image processing algorithms for feature engineering.
- Proficiency in Python and related packages like numpy, scikit-image, PIL, opencv, matplotlib, seaborn, etc.
- Excellent written and verbal communication skills for effectively communicating with the team, and the ability to present information to varied technical and non-technical audiences.
- Must be able to produce solutions independently in an organized manner and also be able to work in a team when required.
- Must have good object-oriented programming and logical analysis skills in Python.
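The object detection and tracking work described above almost always relies on intersection-over-union (IoU) to match predicted boxes against ground truth. As a minimal pure-Python sketch (a production system would use the framework's built-in version):

```python
# Intersection-over-union (IoU) between two axis-aligned boxes,
# each given as (x1, y1, x2, y2). A standard object-detection metric.

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (zero area if the boxes are disjoint).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Overlap is 1x1 = 1; union is 4 + 4 - 1 = 7.
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5, which is how metrics like mAP are scored.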
· 10+ years of Information Technology experience, preferably with telecom/wireless service providers.
· Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability, and releasability; rapid prototyping, data modeling, and decentralized innovation.
· Able to demonstrate an understanding of, and ideally use of, at least one recognised architecture framework or standard, e.g. TOGAF, Zachman Architecture Framework.
· The ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers.
· Demonstrated ability to work collaboratively.
· Excellent written, verbal, and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.).
· Ability to work in a fast-paced, multiple-project environment independently and with minimal supervision.
· Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems.
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, you will play a critical role in leading Punchh’s big data innovations. By leveraging prior industry experience in big data, you will help create cutting-edge data and analytics products for Punchh’s business partners.
This role requires close collaboration with the data, engineering, and product organizations. Your job functions include:
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Serve as a technical leader of Punchh’s big data platform that supports AI and BI products.
- Work with the infra and operations teams to monitor and optimize existing infrastructure.
- Occasional business travel is required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background. High familiarity with the software development life cycle. Familiarity with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR), and streaming (Kafka, Spark).
- Strong problem solving skills with demonstrated rigor in building and maintaining a complex data pipeline.
- Exceptional communication skills and ability to articulate a complex concept with thoughtful, actionable recommendations.
- 4-7 years of Industry experience in IT or consulting organizations
- 3+ years of experience defining and delivering Informatica Cloud Data Integration & Application Integration enterprise applications in lead developer role
- Must have working knowledge on integrating with Salesforce, Oracle DB, JIRA Cloud
- Must have working scripting knowledge (Windows or Node.js)
Soft Skills
- Superb interpersonal skills, both written and verbal, in order to effectively develop materials that are appropriate for a variety of audiences in business and technical teams
- Strong presentation skills; able to successfully present and defend a point of view to business and IT audiences
- Excellent analysis skills and ability to rapidly learn and take advantage of new concepts, business models, and technologies
Hi All,
We are hiring a Data Engineer for one of our clients for the Bangalore & Chennai locations.
Strong knowledge of SCCM, App-V, and Intune infrastructure.
PowerShell/VBScript/Python
Windows Installer
Knowledge of Windows 10 registry
Application Repackaging
Application Sequencing with App-v
Deploying and troubleshooting applications, packages, and Task Sequences.
Security patch deployment and remediation
Windows operating system patching and defender updates
Thanks,
Mohan.G
About Toyota Connected
If you want to change the way the world works, transform the automotive industry and positively impact others on a global scale, then Toyota Connected is the right place for you! Within our collaborative, fast-paced environment we focus on continual improvement and work in a highly iterative way to deliver exceptional value in the form of connected products and services that wow and delight our customers and the world around us.
About the Team
Toyota Connected India is hiring talented engineers at Chennai to use Deep Learning, Computer vision, Big data, high performance cloud-based services and other cutting-edge technologies to transform the customer experience with their vehicle. Come help us re-imagine what mobility can be today and for years to come!
Job Description
The Toyota Connected team is looking for a Senior ML Engineer (Computer Vision) to be a part of a highly talented engineering team helping create new products and services from the ground up for the next generation of connected vehicles. We are looking for team members who are creative in solving problems, excited to work in new technology areas, and ready to wear multiple hats to get things done in a highly energized, fast-paced, innovative, and collaborative startup environment.
What you will do
• Develop solutions using Machine Learning/Deep Learning and other advanced technologies to solve a variety of problems
• Develop image analysis algorithms and deep learning architectures to solve Computer vision related problems.
• Implement cutting edge machine learning techniques in image classification, object detection, semantic segmentation, sequence modeling, etc. using frameworks such as OpenCV, TensorFlow and Pytorch.
• Translate user stories and business requirements to technical solutions by building quick prototypes or proof of concepts with several business and technical stakeholder groups in both internal and external organizations
• Partner with leaders in the area and have the insight to select off-the-shelf components vs. building from scratch
• Convert the proof of concepts to production-grade solutions that can scale for hundreds of thousands of users
• Be hands-on where required and lead from the front in following best practices in development and CI/CD methods
• Own delivery of features from top to bottom, from concept to code to production
• Develop tools and libraries that will enable rapid and scalable development in the future
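The image-classification work described above typically ends with a softmax layer turning raw model scores into class probabilities. A minimal pure-Python sketch of that final step (the class labels are hypothetical; real models would compute logits with TensorFlow or PyTorch as the posting notes):

```python
import math

# Softmax over raw model scores (logits) -> class probabilities,
# plus an argmax to pick the predicted class. Labels are hypothetical.

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels):
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

label, prob = classify([2.0, 0.5, 0.1], ["car", "pedestrian", "cyclist"])
print(label)  # car (the highest logit wins)
```

Subtracting the maximum logit before exponentiating leaves the probabilities unchanged but prevents overflow, the same trick framework implementations use internally.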
You are a successful candidate if
• You are smart and can demonstrate it
• You have 8+ years of experience as software engineer with minimum 3 years hands-on experience delivering products or solutions that utilized Computer Vision
• Strong experience in deploying solutions to production, with hands-on experience in any public cloud environment (AWS, GCP or Azure)
• Excellent proficiency in OpenCV or related computer vision frameworks and libraries
• Mathematical understanding of a variety of statistical learning algorithms (Reinforcement Learning, Supervised/Unsupervised, Graphical Models)
• Expertise in a variety of Deep Learning architectures including Residual Networks, RNN/CNN, Transformer, and Transfer Learning. And experience in delivering value using these in real production environments for real customers
• You have deep proficiency in Python and at least one other major programming language (C++, Java, Golang)
• You are very fluent in one or more ML tools/libraries like TensorFlow, PyTorch, Caffe, and/or Theano and have solved several real-life problems using these
• We think the knowledge acquired earning a degree in Computer Science or Math would be of great value in this position, but if you're smart and have the experience that backs up your abilities, for us, talent trumps degree every time
What is in it for you?
• Top of the line compensation!
• You'll be treated like the professional we know you are and left to manage your own time and workload.
• Yearly gym membership reimbursement and free catered lunches.
• No dress code! We trust you are responsible enough to choose what’s appropriate to wear for the day.
• Opportunity to build products that improve the safety and convenience of millions of customers.
Our Core Values
- Empathetic: We begin making decisions by looking at the world from the perspective of our customers, teammates, and partners.
- Passionate: We are here to build something great, not just for the money. We are always looking to improve the experience of our millions of customers
- Innovative: We experiment with ideas to get to the best solution. Any constraint is a challenge, and we love looking for creative ways to solve them.
- Collaborative: When it comes to people, we think the whole is greater than its parts and that everyone has a role to play in the success!
ML ARCHITECT
Job Overview
We are looking for an ML Architect to help us discover the information hidden in vast amounts of data and make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products. You must have strong experience using a variety of data mining and data analysis methods, building and implementing models, using/creating algorithms, and creating/running simulations. You must also be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes. The role also involves automating the identification of textual data, with its properties and structure, from various types of documents.
Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Creating automated anomaly detection systems and constantly tracking their performance
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Secure and manage when needed GPU cluster resources for events
- Write comprehensive internal feedback reports and find opportunities for improvements
- Manage GPU instances/machines to increase the performance and efficiency of the ML/DL model.
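The automated anomaly detection mentioned in the responsibilities above can be illustrated with a simple z-score detector using the standard-library `statistics` module. This is a generic rule-of-thumb sketch, not the actual method used in the role; production systems would use richer models and streaming data.

```python
import statistics

# Simple z-score anomaly detector over a numeric series.
# Flagging points more than 3 standard deviations from the mean is a
# common rule of thumb, used here purely for illustration.

def anomalies(series, threshold=3.0):
    mean = statistics.fmean(series)
    stdev = statistics.stdev(series)
    return [x for x in series if abs(x - mean) / stdev > threshold]

data = [9, 10, 11] * 6 + [10] + [100]
print(anomalies(data))  # [100]
```

Tracking how often the detector fires over time, as the responsibility list suggests, is what catches both data drift and a mis-tuned threshold.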
Skills and Qualifications
- Strong Hands-on experience in Python Programming
- Working experience with Computer Vision models - Object Detection Model, Image Classification
- Good experience in feature extraction, feature selection techniques and transfer learning
- Working experience in building deep learning NLP models for text classification and image analytics (CNN, RNN, LSTM).
- Working Experience in any of the AWS/GCP cloud platforms, exposure in fetching data from various sources.
- Good experience in exploratory data analysis, data visualisation, and other data preprocessing techniques.
- Knowledge in any one of the DL frameworks like Tensorflow, Pytorch, Keras, Caffe
- Good knowledge of statistics, distribution of data, and supervised and unsupervised machine learning algorithms.
- Exposure to OpenCV; familiarity with GPUs + CUDA.
- Experience with NVIDIA software for cluster management and provisioning, such as nvsm, dcgm, and DeepOps.
- We are looking for a candidate with 14+ years of experience, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with AWS cloud services: EC2, RDS, AWS-Sagemaker(Added advantage)
- Experience with object-oriented/object function scripting languages in any: Python, Java, C++, Scala, etc.