
Software Architect/CTO
at Blenheim Chalcot IT Services India Pvt Ltd


You will build Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, from text, video, and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), Azure Data Catalog and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.
You will be building Power BI-based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with fellow architects, engineers, analysts, team leads, and project managers to plan, build, and roll out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API, SFTP, etc.); a minimal illustrative sketch follows this list.
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/functional scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
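To make the feed-integration expectation above more concrete, here is a minimal, hedged Python sketch (standard library only) that normalises a structured JSON feed and a flat CSV feed into one record shape before loading. The file names, field names (`id`, `amount`), and the `load_records` stub are illustrative assumptions rather than any actual pipeline; in practice this logic would typically live inside an Azure Data Factory, Databricks, or Function App step.

```python
import csv
import json
from pathlib import Path


def records_from_json(path: Path) -> list[dict]:
    """Parse a structured JSON feed (assumed shape: {"items": [...]}) into flat records."""
    with path.open() as f:
        payload = json.load(f)
    return [{"id": item["id"], "amount": float(item["amount"])}
            for item in payload["items"]]


def records_from_csv(path: Path) -> list[dict]:
    """Parse a flat CSV feed with 'id' and 'amount' columns into the same record shape."""
    with path.open(newline="") as f:
        return [{"id": row["id"], "amount": float(row["amount"])}
                for row in csv.DictReader(f)]


def load_records(records: list[dict]) -> None:
    """Stand-in for the real sink (e.g. a staging table); prints for illustration."""
    for record in records:
        print(record)


if __name__ == "__main__":
    for parser, path in [(records_from_json, Path("feed.json")),
                         (records_from_csv, Path("feed.csv"))]:
        if path.exists():
            load_records(parser(path))
```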
Essential Experience:
5 or more years of hands-on experience in a data architect role covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data certifications, at least at the fundamentals level, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
Experience working in the Public sector or in an organisation servicing the Public sector is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute to and adhere to standards, have excellent attention to detail, and be strongly driven by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences, in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect.
Excellent interpersonal skills for working with teams and building trust with clients.
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.

Similar jobs
Job Profile
Profile: Audit & Taxation
Qualification: Semi-qualified CA (dropout)
Years of Experience: 5-6 years
Location: Lower Parel
Salary: 50000-60000
Roles & Responsibilities:
Experience in Statutory Audits of Companies & Banks.
Domain knowledge of GST, with the ability to develop a team, lead the Indirect Tax practice, and work under pressure.
Proficiency in Taxation Laws, Accounting Practices, and Audit.
Major experience in Direct Taxation
Indirect Taxation
Company Audit, Income Tax Audit, Other Statutory Audit, Individual Taxation, Direct and Indirect Tax Return filing, and General Accounting
Desired profile of the candidate
Knowledge of Company Audit, Income Tax Audit, Other Statutory Audit, Individual Taxation, Direct and Indirect Tax Return filing, and General Accounting
Skills:
Excel, MS Office
Working Days -Time
6 days (Mon-Sat), 10-7
Job Title: Data Analytics Engineer
Experience: 3 to 6 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a highly skilled Data Analytics Engineer with expertise in Qlik Replicate, Qlik Compose, and Data Warehousing to build and maintain robust data pipelines. The ideal candidate will have hands-on experience with Change Data Capture (CDC) pipelines from various sources, an understanding of Bronze, Silver, and Gold data layers, SQL querying for data warehouses like Amazon Redshift, and experience with Data Lakes using S3. A foundational understanding of Apache Parquet and Python is also desirable.
Key Responsibilities:
1. Data Pipeline Development & Maintenance
- Design, develop, and maintain ETL/ELT pipelines using Qlik Replicate and Qlik Compose.
- Ensure seamless data replication and transformation across multiple systems.
- Implement and optimize CDC-based data pipelines from various source systems.
2. Data Layering & Warehouse Management
- Implement Bronze, Silver, and Gold layer architectures to optimize data workflows.
- Design and manage data pipelines for structured and unstructured data.
- Ensure data integrity and quality within Redshift and other analytical data stores.
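As a hedged, minimal illustration of the Bronze/Silver/Gold layering described in this section (not the actual pipeline, which runs through Qlik Replicate/Compose and Redshift), the pandas sketch below treats raw CDC records as the bronze layer, keeps the latest change per business key for silver, and aggregates a reporting-ready gold table. The column names and the in-memory sample data are assumptions for illustration only.

```python
import pandas as pd

# Bronze: raw CDC records landed as-is (illustrative sample data).
bronze = pd.DataFrame({
    "order_id":   [1, 1, 2],
    "amount":     [100.0, 120.0, 80.0],
    "op":         ["INSERT", "UPDATE", "INSERT"],
    "changed_at": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
})

# Silver: keep only the latest change per business key and drop deletes.
silver = (
    bronze.sort_values("changed_at")
          .drop_duplicates(subset="order_id", keep="last")
          .query("op != 'DELETE'")
)

# Gold: business-level aggregate ready for reporting.
gold = pd.DataFrame({
    "total_amount": [silver["amount"].sum()],
    "order_count":  [silver["order_id"].nunique()],
})
print(gold)
```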
3. Database Management & SQL Development
- Write, optimize, and troubleshoot complex SQL queries for data warehouses like Redshift.
- Design and implement data models that support business intelligence and analytics use cases.
4. Data Lakes & Storage Optimization
- Work with AWS S3-based Data Lakes to store and manage large-scale datasets.
- Optimize data ingestion and retrieval using Apache Parquet.
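As a small, hedged sketch of the Parquet point above, the snippet below writes a DataFrame as partitioned, compressed Parquet using pandas/pyarrow. The dataset, column names, and output path are placeholders; in a data lake the root path would be an s3:// prefix (which additionally requires s3fs and AWS credentials).

```python
import pandas as pd

# Illustrative dataset; in practice this would come from the replication pipeline.
events = pd.DataFrame({
    "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "user_id":    [10, 11, 10],
    "value":      [1.5, 2.0, 3.25],
})

# Partitioning on a low-cardinality column plus columnar compression lets
# downstream engines (Redshift Spectrum, Athena, Spark) prune the files they read.
events.to_parquet(
    "events_parquet/",              # placeholder; e.g. "s3://bucket/silver/events/"
    engine="pyarrow",
    compression="snappy",
    partition_cols=["event_date"],
)
```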
5. Data Integration & Automation
- Integrate diverse data sources into a centralized analytics platform.
- Automate workflows to improve efficiency and reduce manual effort.
- Leverage Python for scripting, automation, and data manipulation where necessary.
6. Performance Optimization & Monitoring
- Monitor data pipelines for failures and implement recovery strategies.
- Optimize data flows for better performance, scalability, and cost-effectiveness.
- Troubleshoot and resolve ETL and data replication issues proactively.
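As a minimal, hedged illustration of the monitoring and recovery bullet above, this retry helper re-runs a pipeline step with exponential backoff and logs each failure. The `run_replication_task` stub is hypothetical and merely stands in for whatever Qlik Replicate/Compose task or SQL load the pipeline actually invokes.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def with_retries(step, attempts: int = 3, base_delay: float = 5.0):
    """Run a pipeline step, retrying with exponential backoff on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("Step failed (attempt %d/%d)", attempt, attempts)
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))


def run_replication_task():
    """Hypothetical stand-in for the real replication or load step."""
    log.info("running replication task")


if __name__ == "__main__":
    with_retries(run_replication_task)
```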
Technical Expertise Required:
- 3 to 6 years of experience in Data Engineering, ETL Development, or related roles.
- Hands-on experience with Qlik Replicate & Qlik Compose for data integration.
- Strong SQL expertise, with experience in writing and optimizing queries for Redshift.
- Experience working with Bronze, Silver, and Gold layer architectures.
- Knowledge of Change Data Capture (CDC) pipelines from multiple sources.
- Experience working with AWS S3 Data Lakes.
- Experience working with Apache Parquet for data storage optimization.
- Basic understanding of Python for automation and data processing.
- Experience in cloud-based data architectures (AWS, Azure, GCP) is a plus.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced, agile environment.
Preferred Qualifications:
- Experience in performance tuning and cost optimization in Redshift.
- Familiarity with big data technologies such as Spark or Hadoop.
- Understanding of data governance and security best practices.
- Exposure to data visualization tools such as Qlik Sense, Tableau, or Power BI.

- Work closely with product managers and engineers to design, implement, test and continually improve scalable frontend and backend services.
- Develop products using agile methods and tools.
- Develop commercial grade software that is user friendly and suitable for a global audience.
- Plan, create and execute (manual and automated) tests.
- Be involved and participate in the overall application lifecycle.
- Building reusable code and libraries for future use.
- Staying up to date with current technologies and providing insights on cutting edge software approaches, architectures, and vendors.
- Fluency in any one of JavaScript, TypeScript or Python.
- Strong problem-solving skills.
- Should have built large scalable enterprise applications from scratch.
- Strong experience in architectural patterns and high-level designs.
- Experience in NoSQL and SQL databases.
- Bachelor's degree in finance or related field.
- 2+ years of experience in mortgage processing.
- Excellent mathematical and statistical aptitude.
- Analytical mindset.
- Good written and verbal communication.
- Good organizational skills.
- Exceptional interpersonal skills.
- Attention to detail.


We are looking for an enthusiastic and passionate Web Developer Intern with a proven track record of building web apps from the ground up using ReactJS.
Skills Required
- 1-3 Years of experience in Web Development
- Strong proficiency in ReactJS and Firebase
- Knowledge of Twilio API is a plus.
- Understanding of front-end development tools such as Webpack, NPM, etc.
- Ability to understand business requirements and translate them into technical requirements.
- Any production project on ReactJS is a plus.
Roles and Responsibilities
- Developing new user-facing features using ReactJS.
- Building reusable components and front-end libraries for future use
- Translating UI/UX design wireframes into high-quality code for producing visual elements of the application.
- Optimizing components and creating docs, unit, scenario, integration, and sanity tests, etc., if required.
About Us:
Spacenos is a fast-growing start-up that has been innovating in the finance, edtech, and marketing domains since 2015, winning multiple awards and recognitions from more than 40 MNCs and Fortune 500 companies. Our clients are based in the U.S.A. and Australia. We are funded and supported by the Government of Karnataka, angel investors, and international grants.
Hiring Process:
- Apply, and your CV and past work will be reviewed.
- Receive a telephonic interview or assessment upon filling in the Final Step form.
- Receive an offer letter if selected.
Hiring Duration:
Our hiring process takes less than 24 hours from the time you receive the Final Step form.
Validity: Up to Dec 2023
- Apply soon; earlier applicants will be preferred over later ones.
Our client focuses on providing solutions in terms of data, analytics, decisioning and automation. They focus on providing solutions to the lending lifecycle of financial institutions and their products are designed to focus on systemic fraud prevention, risk management, compliance etc.
Our client is a one stop solution provider, catering to the authentication, verification and diligence needs of various industries including but not limited to, banking, insurance, payments etc.
Headquartered in Mumbai, our client was founded in 2015 by a team of three veteran entrepreneurs, two of whom are chartered accountants and one of whom is a graduate of IIT Kharagpur. They have been funded by tier 1 investors and have raised $1.1M in funding.
What you will do:
- Interacting with the clients to understand their requirements and translating it to the developer to take it forward
- Acting as a liaison between end users/clients and internal teams, helping them fulfil client requests and resolving queries with optimal time and effort
- Contributing to the implementation of various solutions in the existing process flows of clients using our products and helping communicate concepts to the product team to enhance future product requirements
Desired Candidate Profile
What you need to have:
- CA, CFA, or a related qualification
- Excellent communication and task management skills
- In-depth knowledge of the various Income Tax Department websites, portals, and their workings, etc.
- In-depth knowledge of the Income Tax Department rules, regulations, guidelines, due dates, services, facilities, etc.
- In-depth knowledge of Income Tax return filing processes using XML Upload, JSON Upload, Prefilled JSON, etc.
- In-depth knowledge of the Income Tax Filing Regimes
- In-depth knowledge of Income Tax XML responses, JSON responses, and data points pertaining to the calculation of Financial Ratios, Balance Sheet, P&L, etc.
- In-depth knowledge of E-Return Intermediaries and their rules, regulations, guidelines, permissions, compliances, etc.
- Passable knowledge of GST, GST Portal and GST filing processes
- A good working knowledge of the financial industry and regulatory environment
- Ability to quickly grasp the various data sources that the company covers and gain command of them over time
- Understanding and translating statistics to address client business problems, and liaising with the analytics team to build and deliver custom solutions
- Good understanding of data query languages like SQL, along with MS Office, R/Python (good to have), and other statistical analysis tools
- Ability to be creative, analytical, and think outside the box to solve problems
We are looking for a passionate, highly driven, intrinsically motivated Associate Product Manager who wants to join a high-growth startup, learn something new every day, and join one of the most energetic and speedy Product Teams in health-tech!
Responsibilities
- Building and executing new initiatives and roadmaps for retention and increasing customer lifetime value.
- Understand the healthcare market, customers and build business cases for new product opportunities.
- Listen to the users on a regular basis and figure out the opportunities to solve their problems.
- Manage the product lifecycle from ideation to launch and beyond, which includes liaising with multiple stakeholders.
- Work closely with partners and clients from 4 continents to localize the product builds as per their need and ensure adoption of the new features.
- Use data, creativity, and experimentation to constantly improve the product experience
Requirements
- Have 1-4 years of experience in building and managing large-scale enterprise products.
- Have prior experience in the development team and understand how modern development frameworks function.
- Are passionate about translating customer needs to usable design and process flows for growth levers and optimization.
- Have a good understanding of funnels, agile & sprints, and wireframes.
- Display comfort with ambiguity, the skill to transform broad ideas into action plans, and empathy towards users as well as your colleagues.
- Understand how the application functions and the technicalities around it.
- Have familiarity with tools like Google Data Studio, Firebase, Google Analytics, product analytics tools, and Figma.
- Love analytics and very frequent experimentation.
Node.js, MongoDB, Redis, ElasticSearch.


