
Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Role and Responsibility

· Plan, create, coordinate, and deploy data warehouses.
· Design the end-user interface.
· Create best practices for data loading and extraction.
· Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
· Develop reporting applications and maintain data warehouse consistency.
· Facilitate requirements gathering using expert listening skills and develop simple, unique solutions to meet the immediate and long-term needs of business customers.
· Supervise design throughout the implementation process.
· Design and build cubes while performing custom scripts.
· Develop and implement ETL routines according to the DWH design and architecture.
· Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse.
· Monitor the DWH and BI systems' performance and integrity; provide corrective and preventive maintenance as required.
· Manage multiple projects at once.

DESIRABLE SKILL SET

· Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as with SSIS and stored procedures
· Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases
· High proficiency in dimensional modeling techniques and their applications
· Strong analytical, consultative, and communication skills, as well as good judgment and the ability to work with both technical and business personnel
· Several years of working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools
· Working knowledge of SAS and R code used in data processing and modeling tasks
· Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other "big data" technologies such as AWS Redshift or Google BigQuery
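As an illustration of the ETL routines mentioned above, here is a minimal extract-transform-load sketch in Python using the standard-library sqlite3 module. The table names, columns, and sample rows are hypothetical, not part of any real warehouse:

```python
import sqlite3

# Hypothetical staging and dimension tables; all names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customer (id INTEGER, name TEXT, country TEXT)")
conn.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, country TEXT)")
conn.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                 [(1, "  Acme Corp ", "us"), (2, "Globex", "IN")])

# Extract: pull raw rows from the staging area
rows = conn.execute("SELECT id, name, country FROM stg_customer").fetchall()

# Transform: trim whitespace, normalise country codes to upper case
clean = [(i, n.strip(), c.upper()) for (i, n, c) in rows]

# Load: write the cleaned rows into the dimension table
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)", clean)

print(conn.execute("SELECT * FROM dim_customer").fetchall())
# → [(1, 'Acme Corp', 'US'), (2, 'Globex', 'IN')]
```

A production routine would add incremental loads, surrogate-key management, and error handling, but the extract/transform/load separation is the same.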

Responsibilities
- Perform hands-on development using Java, Spring Boot, Angular.
- Drive solution design, code quality, best practices, and performance optimization.
- Work closely with the client’s product owners and architects.
- Conduct code reviews, technical grooming, and sprint planning.
- Troubleshoot and resolve complex technical issues.
- Ensure timely delivery of modules with high quality.
- Mentor team members and support them in solving technical blockers.
Technical Skills Required
- Strong expertise in Java 8+, Spring Boot, REST APIs.
- Strong front-end experience with Angular 8+, TypeScript, HTML, CSS.
- Experience with SQL/NoSQL databases (MySQL, PostgreSQL, MongoDB, etc.).
- Hands-on with Git, Maven/Gradle, Jenkins, CI/CD.
- Knowledge of cloud platforms (AWS) is an added advantage.
- Experience with Agile/Scrum methodologies.
Assistant Manager – Marketing
Remote Opportunity
Qualifications & Experience
Education:
• Bachelor's degree in Marketing, Mass Communication, Design, or Business
• Certification in Digital Marketing (Google, Meta, HubSpot, or similar) preferred
Experience:
• 4–7 years of experience in B2B marketing, preferably in the certification, training, SaaS, consulting, or services industry
• Experience leading multi-channel campaigns and content creation
• Prior exposure to compliance, ISO standards, ESG, or cyber/information security is an added advantage
We are looking for a SAP PLM DMS Consultant with in-depth knowledge of Document Management System configuration and integration. The candidate must be proficient in DMS object links, document info records, status management, and the CAD interface. Experience in managing engineering change processes is preferred.
Job Description – Full Stack Development (various levels)
"RAP - Rapid Acceleration Partners" provides practical AI solutions for digital business transformation. With the aim of democratizing AI, RAP has developed RAPFlow, an AI orchestration platform for building content intelligence solutions, and RAPBot, an RPA tool for end-to-end automation. RAP’s vision is to provide a unified Intelligent Process Automation platform centered around Computer Vision and Natural Language Processing, combined with RPA.
If you have the passion to be part of a fast-growing team that is geared towards redefining how IPA solutions are delivered and have that X Factor to contribute to a world class product, we have a place for you! Visit https://rapidautomation.ai/ for more details about RAP.
Position : Full Stack Developer (various levels)
Location : Chennai
Experience : 1-2 Years
Job type : Permanent
Team: You will be part of an engineering team, developing software that enables customers to automate business processes with low-code/no-code. The product(s) that you will work on have Web Applications, Web Services, AI components/services, Desktop and Browser automation components, Data Pipelines, Analytics, Frameworks for development/testing/CI/CD, and more.
Responsibilities (as a Full Stack Engineer):
• Work on any/all layers of existing or new products, developing end-to-end features
• Build generic or custom solutions for PoCs
• Package/deploy/support/maintain the product in production and dev/test environments
Skills - Must have:
• Full Stack Development
• React
• Node.js
• TypeScript
• MongoDB
Skills – Good to have:
• Docker
• Node-RED
• DevOps - CI/CD
• Cloud deployment, architecture, and technologies
1. Develop backends for applications in the ecommerce/insurance/wealth management businesses
2. Design technically sound systems and deliver results quickly
3. Build highly performant applications that set top standards in their respective industries
Basic qualifications:
1. 5-7 years of experience building highly performant applications in Python
2. Expertise in Python frameworks like Django
3. Familiarity with REST APIs
4. Good team-handling experience
5. Good grasp of data structures and proficiency in problem-solving
6. Knowledge of design patterns
Anyone interested, please send me your resume.
Data modelling and implementing business logic, as well as focusing on engineering and design of the platform.
● API design and development
● Implementation of CRUD (Create, Read, Update, Delete) operations
● Writing reusable, testable, and efficient code
● Design and implementation of low-latency, high-availability, and performant applications
● Integration of user-facing elements developed by front-end developers with server-side logic
● Write and implement software solutions that integrate different systems
● Identify and suggest ways of improving efficiency and functionality
● Come up with reusable code that is efficient and easily testable
● Use backend logic to integrate user-facing features
● Development of middleware ensuring high performance and responsiveness to requests from the frontend and also development of a complex & secure data aggregation system
● Diagnose bugs and other issues in products
● Write and implement Low-Latency Applications
● Implement security and data protection
● Design and build scalable REST APIs
● Develop, test, tune for performance and deploy web services
● Work with product team to build innovative, robust, and easy-to-use features.
Collaborate with the team, optimize and refactor the back-end architecture
● Contribute to architectural and design discussions
● Ensure smooth and timely communication with both the internal and external stakeholders.
● Participate in estimations and ensure timely delivery of the features
● Design, Develop & Unit test features in the product
● Conduct peer reviews and ensure quality of committed code
Required Skills:
● Excellent software engineering skills and experience of 2-4 years.
● Solid foundation in data structures and algorithms
● Data modelling and database design. Expert in Python, with knowledge of at least one Python web framework (such as Django, Flask, etc.)
● Good understanding of server-side templating languages like DTL and Jinja2
● Good understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
● Knowledge of at least one ETL tool or framework (such as Apache Airflow)
● Familiarity with SQL and any ORM framework
● In-depth knowledge of any one web server, like nginx or apache2, would be a plus
● Competent in designing and building web applications and/or web services in a commercial setting
● Competent in design/implementation for reliability, availability, scalability and performance
● Working knowledge of code versioning tools such as Git
● Strong unit test and debugging skills are a plus
● Good understanding of designing microservices
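The CRUD implementation asked for above can be sketched in a few lines of framework-agnostic Python. The resource ("task") and its fields are hypothetical; in Django or Flask this store would be replaced by the ORM or database layer:

```python
import itertools

class TaskStore:
    """In-memory store standing in for the database/ORM layer behind a REST API."""
    def __init__(self):
        self._tasks = {}
        self._ids = itertools.count(1)

    def create(self, title):
        task_id = next(self._ids)
        self._tasks[task_id] = {"id": task_id, "title": title}
        return self._tasks[task_id]

    def read(self, task_id):
        return self._tasks.get(task_id)

    def update(self, task_id, title):
        if task_id in self._tasks:
            self._tasks[task_id]["title"] = title
            return self._tasks[task_id]
        return None

    def delete(self, task_id):
        return self._tasks.pop(task_id, None) is not None

store = TaskStore()
t = store.create("write unit tests")
store.update(t["id"], "write more unit tests")
print(store.read(t["id"]))   # → {'id': 1, 'title': 'write more unit tests'}
store.delete(t["id"])
```

Each method maps onto one HTTP verb (POST, GET, PUT/PATCH, DELETE), which is the shape a framework view layer would expose.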
Behavioural
● A self-starter attitude, the ability to work independently and in a group, demonstrated initiative, and good writing/briefing skills are great to have
● Should be good at analytical thinking and breaking down large problems into solvable chunks
● Demonstrated ability to think creatively and come up with ideas/thoughts with significant business/organizational impact
● Ability to quickly adapt to changing technological trends
Job description
The role requires you to design deployment pipelines from the ground up, create Dockerfiles, and design and operate highly available systems in AWS Cloud environments. It also involves configuration management, web services architectures, DevOps implementation, database management, backups, and monitoring.
Key responsibility area
- Ensure reliable operation of CI/CD pipelines
- Orchestrate the provisioning, load balancing, configuration, monitoring and billing of resources in the cloud environment in a highly automated manner
- Manage logging, metrics, and alerting
- Create Bash/Python scripts for automation
- Perform root cause analysis for production errors
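The automation scripts and root-cause analysis mentioned above often start as small log-scanning utilities. Below is a minimal Python sketch; the log format and component names are assumptions for illustration:

```python
import collections
import os
import re
import tempfile

def summarise_errors(log_path):
    """Count ERROR lines per component to point root cause analysis
    at the noisiest subsystem first."""
    # Assumed log format: "<timestamp> LEVEL [component] message"
    pattern = re.compile(r"ERROR\s+\[(?P<component>[\w-]+)\]")
    counts = collections.Counter()
    with open(log_path) as fh:
        for line in fh:
            m = pattern.search(line)
            if m:
                counts[m.group("component")] += 1
    return counts.most_common()

# Demo against a throwaway log file (contents are illustrative)
sample = (
    "2024-01-01 INFO  [api] started\n"
    "2024-01-01 ERROR [db] connection refused\n"
    "2024-01-01 ERROR [db] timeout\n"
    "2024-01-01 ERROR [api] 500 on /health\n"
)
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as fh:
    fh.write(sample)
print(summarise_errors(fh.name))  # → [('db', 2), ('api', 1)]
os.unlink(fh.name)
```

In practice the same counting logic would feed an alerting pipeline (e.g. shipped to ELK or Prometheus) rather than a print statement.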
Requirements
- 2 years of experience as a Team Lead.
- Good command of Kubernetes.
- Proficient in the Linux command line and troubleshooting.
- Proficient in AWS services: deployment, monitoring, and troubleshooting of applications in AWS.
- Hands-on experience with CI tooling, preferably Jenkins.
- Proficient in deployment using Ansible.
- Knowledge of infrastructure management (Infrastructure as Code) tools such as Terraform, AWS CloudFormation, etc.
- Proficient in deploying applications behind load balancers and proxy servers such as nginx and apache.
- Scripting languages: Bash, Python, Groovy.
- Experience with logging, monitoring, and alerting tools like ELK (Elasticsearch, Logstash, Kibana), Nagios, Graylog, Splunk, Prometheus, or Grafana is a plus.
Must Have:
Linux, CI/CD (Jenkins), AWS, Scripting (Bash, Shell, Python, Go), Nginx, Docker.
Good to have
Configuration management (Ansible or a similar tool), logging tools (ELK or similar), monitoring tools (Nagios or similar), IaC (Terraform, CloudFormation).

In 2020, ReNew Power, India’s largest renewables developer, acquired Climate Connect. Following ReNew’s listing on NASDAQ in summer 2021, Climate Connect has become the technology anchor of a new, fully independent subsidiary, Climate Connect Digital, with backing from ReNew as the anchor investor to pursue an ambitious and visionary new strategy for rapid organic and inorganic growth.
Our mission has technology at its core and involves unlocking value through intelligent software, digitalisation, and ‘horizontal integration’ across the energy ecosystem. However, computational power and machine learning in the energy sector have yet to be fully leveraged and can create massive value.
We are looking for people with:
● Excellent verbal communications, including the ability to clearly and concisely articulate complex concepts to both technical and non-technical collaborators
● Demonstrated background in Computer Science, Statistics, Mathematics, Software Engineering, or related technical fields
● Industry experience with proven ability to apply scientific methods to solve real-world problems on large scale data
● Extensive experience with Python and SQL for software development, data analysis, and machine learning
● Experience with libraries: TensorFlow, Keras, NumPy, scikit-learn, pandas, scikit-image, Matplotlib, Jupyter, Statsmodels
● Experience with time series analysis, including EDA, statistical inference, ARIMA, GARCH
● Knowledge of Cluster Analysis, Classification Trees, Discriminant Analysis, Neural Networks, Deep Learning, Logistic Regression, Associations Analysis
● Hands-on experience in implementing deep learning models with video and time series data (CNNs, LSTMs, Autoencoders, RBMs)
● Experience of Regression, Multicriteria Decision Making, Descriptive Statistics, Hypothesis Testing, Segmentation/ Classification, Predictive Analytics
● Aptitude and experience in applied statistics and machine learning techniques
● Firm grasp of interactive and self-service visualization tools, such as business intelligence dashboards and notebooks
● Experience launching production-quality machine learning models at scale e.g. dataset construction, preprocessing, deployment, monitoring, quality assurance
● Experience with mathematical programming is an added advantage, for example: optimization, computational geometry, numerical linear algebra, etc.
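The time series skills listed above (ARIMA, GARCH, EDA) are usually benchmarked against a naive baseline before any heavier model is fitted. Here is a dependency-free sketch of such a baseline; the price series and window sizes are hypothetical:

```python
def moving_average_forecast(series, window=3, horizon=2):
    """Naive baseline: forecast each future step as the mean of the
    last `window` observations, appending each forecast as it is made."""
    history = list(series)
    preds = []
    for _ in range(horizon):
        pred = sum(history[-window:]) / window
        preds.append(pred)
        history.append(pred)
    return preds

# Hypothetical hourly electricity price series
prices = [30.0, 32.0, 31.0, 33.0, 35.0]
print(moving_average_forecast(prices, window=3, horizon=2))
```

A proper ARIMA or GARCH model (e.g. via Statsmodels) should beat this baseline on held-out data; if it does not, that is itself a useful diagnostic.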
What you’ll work on:
We are developing a marketing automation platform through which an electricity retailer may apply a suite of proprietary ML algorithms to optimize outcomes across a range of channels and touchpoints. We require the services of a data science professional who can design and implement various AI/ML models that optimize the performance, quality, and reliability of the product. This position offers a potential pathway to leading an entire ML expert team. These are a few things you can look forward to working on:
● Translating high-level problems and key objectives into granular model requirements.
● Defining acceptance criteria that are well structured, detailed, and comprehensive.
● Developing and testing algorithms using our price forecasts and customers' energy portfolios.
● Collaborating with the software engineering team in deploying the developed models tailored to specific customer needs.
● Participating in the software development process, and doing the required testing, and debugging to support the deployed models.
● Taking responsibility for ensuring tracking of appropriate events/metrics, so that monitoring is timely and rigorous.
● Driving the response to the discovery of regressions or failures, by undertaking various exercises (e.g. debugging, RCA, etc.) as needed
Experience:
● 6-11 years of experience in the field of Data Science or Machine Learning
Qualifications:
● B.E / B. Tech / M. Tech / PhD in CS/IT or Data Sciences
What’s in it for you
We offer competitive salaries based on prevailing market rates. In addition to your introductory package, you can expect to receive the following benefits:
Flexible working hours
Unlimited annual leaves
Learning and development budget
Medical insurance/Term insurance, Gratuity benefits over and above the salaries
Access to industry and domain thought leaders
At Climate Connect Digital, you get a rare opportunity to join an established company at the early stages of a significant and well-backed global growth push.
Link to apply - https://climateconnect.digital/careers/?jobId=gaG9dgeTYBvF
Requirements:
- Spring Boot + Google Cloud experience: 4+ years
- Strong Spring: 4+ years
- RESTful services: 4+ years
- J2EE: 4+ years
- Core Java: 4+ years
- Must have hands-on experience in Java 8 or higher / J2EE
- Understanding of Agile and Lean software development processes and practices.
- Experience in Spring Boot and other Spring frameworks
- Experience in developing web applications using Spring MVC
- Experience in MongoDB/ Kafka / RabbitMQ
- Microservices
- Collaborate with other team members and stakeholders
- Mandatory Skills: Core Java, Spring/Spring Boot, REST, Hibernate













