
Senior Manager - Data Intelligence
at Data Intelligence Platform For Decision-Makers
Supporting today’s data-driven business world, our client is a full-stack data intelligence platform that leverages granular, deep data from varied sources to help decision-makers at the executive level. Their solutions include supply chain optimization, building footprints, construction hotspot tracking, real estate, and more.
Their work embeds geospatial analytics, location intelligence, and predictive modelling in the foundations of economic modelling and evaluation theory to build data intelligence layers for their clients, which include governments, multilateral institutions, and private organizations.
Headquartered in New Delhi, our client is a team of economists, data scientists, geospatial analysts, and more. Their decision-support systems draw on big data, predictive modeling, forecasting, and socio-economic datasets.
As a Senior Manager - Data Intelligence, you will be responsible for contributing to all stages of projects: conceptualizing, preparing work plans, overseeing analytical work, driving teams to meet targets, and ensuring quality standards.
What you will do:
- Developing a thorough understanding of the data processing pipeline and helping the technical team troubleshoot and solve problems
- Acting as the single point of contact (SPOC) for client communications across the portfolio of projects undertaken by the organization
- Contributing to different aspects of organizational growth: team building, process building, strategy, and business development
Desired Candidate Profile
What you need to have:
- Post-graduate degree in a relevant subject such as Economics, Engineering, Quantitative Social Sciences, or Management
- At least 5 years of relevant work experience
- Extensive experience in managing multiple client projects
- Strong data/quantitative analysis skills are a must
- Prior experience working in data analytics teams
- Demonstrable experience with data platforms and programming languages

Job Details
- Job Title: Lead I - Business Analysis
- Industry: Global Digital Transformation Solutions Provider
- Domain: Information Technology (IT)
- Experience Required: 5-7 years
- Employment Type: Full Time
- Job Location: Thiruvananthapuram (Trivandrum)
- CTC Range: Best in Industry
Job Description:
- 5+ years of experience in RPA business analysis
- Degree in Business Management, Information Technology, Financial Management, or a similar relevant field
- Excellent organizational and time management skills
- Outstanding communication and presentation skills
- Critical thinking, creative problem-solving, and analytical skills
- Data-driven mindset
- Experience with different development methodologies (Agile, Lean Product Development, Waterfall)
- Ability to handle multiple projects simultaneously and independently
- Good understanding of RPA and automation tools in the industry
- Expertise in establishing processes and practice assets for large teams
- Excellent written and verbal communication skills
Skills: Business analysis, Agile, business process modeling, business case development
Must-Haves
RPA business analysis (5+ years), Agile development methodology, Business process modeling, Business case development, Automation tools expertise
Notice period: 0 to 15 days only
Job stability is mandatory
Location: Trivandrum
Hybrid model: 3 days in office.
At Palcode.ai, we're on a mission to fix the massive inefficiencies in pre-construction. Think about it: in a $10 trillion industry, estimators still spend weeks analyzing bids, project managers struggle with scattered data, and costly mistakes slip through complex contracts. We're fixing this with purpose-built AI agents that work. Our platform works “magic” on pre-construction workflows, cutting them from weeks to hours. It's not just about AI; it's about bringing real, measurable impact to an industry ready for change. We are backed by names like AWS for Startups, Upekkha Accelerator, and Microsoft for Startups.
Why Palcode.ai
Tackle Complex Problems: Build AI that reads between the lines of construction bids, spots hidden risks in contracts, and makes sense of fragmented project data
High-Impact Code: Your code won't sit in a backlog – it goes straight to estimators and project managers who need it yesterday
Tech Challenges That Matter: Design systems that process thousands of construction documents, handle real-time pricing data, and make intelligent decisions
Build & Own: Shape our entire tech stack, from data processing pipelines to AI model deployment
Quick Impact: Small team, huge responsibility. Your solutions directly impact project decisions worth millions
Learn & Grow: Master the intersection of AI, cloud architecture, and construction tech while working with founders who've built and scaled construction software
Your Role:
- Design and build our core AI services and APIs using Python (see the sketch after this list)
- Create reliable, scalable backend systems that handle complex data
- Help set up cloud infrastructure and deployment pipelines
- Collaborate with our AI team to integrate machine learning models
- Write clean, tested, production-ready code
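To give a flavor of this kind of work, here is a minimal sketch of a Python service endpoint; the framework (FastAPI), the route, and the risk-term heuristic are illustrative assumptions, not Palcode.ai's actual stack or API.

```python
# Minimal sketch of a Python AI-service endpoint (illustrative only).
# FastAPI and the /analyze-bid route are assumptions, not Palcode.ai's API.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class BidDocument(BaseModel):
    project_id: str
    text: str

@app.post("/analyze-bid")
def analyze_bid(doc: BidDocument) -> dict:
    # Placeholder for a real model call: flag bids that mention
    # risk-related terms so an estimator can review them first.
    risk_terms = ("penalty", "liquidated damages", "indemnify")
    flags = [term for term in risk_terms if term in doc.text.lower()]
    return {"project_id": doc.project_id, "risk_flags": flags}
```

Run locally with `uvicorn main:app --reload` and POST a JSON body with `project_id` and `text` to `/analyze-bid`.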
You'll fit right in if:
- 6 months of live project experience in Python development
- At least one full-time Python internship of 3+ months
- You're comfortable with full-stack development and cloud services
- You write clean, maintainable code and follow good engineering practices
- You're curious about AI/ML and eager to learn new technologies
- You enjoy fast-paced startup environments and take ownership of your work
How we will set you up for success
- You will work closely with the Founding team to understand what we are building.
- You will be given comprehensive training on the tech stack, with the option of virtual training as well.
- You will be involved in a monthly one-on-one with the founders to discuss feedback
- A unique opportunity to learn from the best: we are Gold partners of the AWS, Razorpay, and Microsoft startup programs, giving you access to rich talent to discuss and brainstorm ideas with.
- You’ll have a lot of creative freedom to execute new ideas. As long as you can convince us, and you’re confident in your skills, we’re here to back you in your execution.
Location: Remote
Compensation: 15K-18K for the 4-6 month probation period
2.5L-3 LPA after probation
If you get excited about solving hard problems that have real-world impact, we should talk.
All the best!!
Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.
A strong understanding of the development lifecycle and solid debugging skills are needed, along with time management, business acumen, a positive attitude, and openness to continual growth.
The ability to code appropriate solutions will be tested in the interview.
- Knowledge of a wide variety of generative AI models
- Conceptual understanding of how large language models work
- Proficiency in coding languages for data manipulation (e.g., SQL) and machine learning & AI development (e.g., Python); see the sketch after this list
- Experience with dashboarding tools such as Power BI and Tableau (beneficial but not essential)
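As a small illustration of the SQL-plus-Python data-manipulation skill named above, here is a minimal sketch using only the standard library; the table, columns, and values are hypothetical.

```python
# Minimal SQL-in-Python sketch using the standard library (sqlite3).
# The usage table and its contents are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (model TEXT, tokens INTEGER)")
conn.executemany(
    "INSERT INTO usage VALUES (?, ?)",
    [("model-a", 1200), ("model-a", 800), ("model-b", 500)],
)

# Aggregate token usage per model: the kind of query a Power BI or
# Tableau dashboard would be fed from.
for model, total in conn.execute(
    "SELECT model, SUM(tokens) FROM usage GROUP BY model ORDER BY 2 DESC"
):
    print(model, total)
```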
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Scala, Python, and Spark
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze streaming data, plus Event/IoT Hubs and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level, hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the technology stack available in the industry for data management, ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for Data Science and Machine Learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have: programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala and Spark SQL/PySpark (see the sketch after this list)
- Knowledge of AWS/Azure/GCP DevOps processes like CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates
- Able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros and cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate when needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
- Overall 5+ years' experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of agile methodologies
- AWS Certified Solutions Architect - Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified - Professional
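To illustrate the table-creation and aggregation requirement above, here is a minimal PySpark sketch, assuming a Spark session with Hive/Databricks managed-table support; the paths, table name, and columns are hypothetical.

```python
# Minimal PySpark sketch: partitioned/bucketed table plus a Spark SQL aggregate.
# Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partition-bucket-demo").getOrCreate()

# Load raw records, e.g. Parquet files landed by an ingestion job.
raw = spark.read.parquet("/mnt/landing/sales_raw")

# Save as a managed table, partitioned by region and bucketed by customer.
(raw.write
    .partitionBy("region")
    .bucketBy(8, "customer_id")
    .sortBy("customer_id")
    .format("parquet")
    .mode("overwrite")
    .saveAsTable("sales"))

# Aggregate with Spark SQL; a filter on region would benefit from
# partition pruning.
spark.sql("""
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS orders
    FROM sales
    GROUP BY region
""").show()
```

Bucketing by `customer_id` is a design choice that speeds up later joins and aggregations on that column by avoiding a full shuffle.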
Java/J2EE stack
• Design and develop RDandX Network’s microservices and ensure bug-free code is pushed to the deployment pipeline to support large volumes of transactions
• Define and communicate the technical design requirements to the Network’s stakeholders and the Engineering lead
• Build RESTful services to integrate with third-party services such as the AdWords and Facebook Marketing APIs
• Design the technical architecture of the different services and maintain and upgrade it
• Design the unit test cases and build the framework for the development team to enforce unit testing in all the services
• Be involved and participate in end-to-end product lifecycle management
• Learn about new technologies and stay up to date with best practices
• Collaborate with a multidisciplinary team of designers, engineers, system administrators, and the product team
• Lead the Backend team and manage their day-to-day activities and work deliverables
Responsibilities:
- Create exclusive distributors, super stockists, and business associates for the company.
- Guide and coordinate them to achieve targets for the overall growth of the company.
- Find and develop new marketing ideas to improve sales.
- Monitor all stock reports, product orders, re-orders, etc.
Requirements:
- Bachelor's degree and experience in business, marketing, or a related field.
- Strong communication skills.
- Basic knowledge of Word, Excel, and PowerPoint.
- Age between 30 and 60 years.
Come Dive In
The DevOps Engineer will execute the tools and processes that enable DevOps.
You will engage in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement, to efficiently deliver high-quality solutions. The candidate should bridge the gap between the Development and Operations teams, working with development teams to meet acceptance criteria and to gather and document requirements. Candidates should be able to work in fast-paced, multi-disciplinary environments.
As a DevOps Engineer, You Will
● Work in a dynamic, agile team environment developing excellent new applications.
● Participate in design decisions, including new technology research and prototyping
● Collaborate closely with other AWS engineers and architects, cloud engineers, support teams, and other stakeholders
● Promote great Kubernetes and AWS platform design and quality
● Innovate new ideas to evolve our applications and processes
● Continuously analyze and evaluate our systems, products, and methods for potential improvements
Mandatory Skills:
● Experience with Linux-based infrastructure
● Experience with Amazon ECS (see the sketch after this list)
● Hands-on experience with containerized services
● Must know AWS CI/CD pipelines
● Must know DevOps concepts and Agile principles
● Knowledge of Git, Docker, and Jenkins
● Knowledge of Infrastructure as Code
● Experience using automation tools
● Must have experience setting up test-driven development environments
● Working knowledge of Docker and Kubernetes
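As a small example of the ECS experience called out above, here is a Python sketch using boto3 to inspect clusters and their services; it assumes AWS credentials are already configured, and the region is an arbitrary choice.

```python
# Minimal boto3 sketch: list ECS clusters and summarize their services.
# Assumes AWS credentials are configured; the region is an arbitrary example.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

for cluster_arn in ecs.list_clusters()["clusterArns"]:
    service_arns = ecs.list_services(cluster=cluster_arn)["serviceArns"]
    print(f"{cluster_arn}: {len(service_arns)} service(s)")
    if service_arns:
        # describe_services accepts at most 10 services per call.
        detail = ecs.describe_services(cluster=cluster_arn,
                                       services=service_arns[:10])
        for svc in detail["services"]:
            print(f"  {svc['serviceName']}: "
                  f"desired={svc['desiredCount']} running={svc['runningCount']}")
```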
We recognize that asking you to give 100% of yourself daily requires us to show you the love.
PERKS: what can we offer you?
● Bi-yearly performance audits and appraisals
● Flexible working days/hours
● 5 working days/week (Mon to Fri), with added payout for working Saturdays
● Recognition and Appreciation
● A plethora of industry exposure and self-growth opportunities
Visit our site: www.cedcoss.com
- Minimum of 2 end-to-end implementations in SAP Enterprise Portal.
- Working knowledge of SAP Portal system administration, content administration, and user management, including creation of roles, iViews, etc.
- Familiarity with NWDS/NWDI (different perspectives and their use) and NWDI track creation.
- SAP Enterprise Portal (PCD, KM, UWL, and System Objects).
- Experience with Web Dynpro ABAP and Java applications.
