Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia

Role overview:
- Must have 5 - 11 years of overall experience, with at least 3 years of relevant Big Data experience.
- Must have experience in building highly scalable business applications that implement large, complex business flows and deal with huge amounts of data.
- Must have experience in Hadoop, Hive, and Spark with Scala, with good experience in performance tuning and debugging.
- Good to have experience with stream processing (Spark Streaming or Kafka with Java/Scala).
- Must have experience in the design and development of Big Data projects.
- Good knowledge of functional programming and OOP concepts, SOLID principles, and design patterns for developing scalable applications.
- Familiarity with build tools like Maven.
- Must have experience with at least one RDBMS, preferably PostgreSQL.
- Must have experience writing unit and integration tests using ScalaTest.
- Must have experience using a version control system, e.g. Git.
- Must have experience with CI/CD pipelines; Jenkins is a plus.
- Basic hands-on experience with one of the cloud providers (AWS/Azure) is a plus.
- Databricks Spark certification is a plus.
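Much of the Spark-with-Scala work described above follows a filter-then-aggregate-by-key shape. A minimal language-neutral sketch in plain Python (the event records and field names are invented for illustration; in Spark this would be a distributed `filter`/`reduceByKey` chain):

```python
# Toy records: (user_id, amount) events, as a Spark job might read from Hive.
events = [("u1", 40), ("u2", 15), ("u1", 25), ("u3", 60), ("u2", 5)]

# Filter out small events, then aggregate per key -- the same shape as a
# filter().map().reduceByKey() chain in Spark with Scala.
large = [e for e in events if e[1] >= 10]
totals = {}
for user, amount in large:
    totals[user] = totals.get(user, 0) + amount

print(totals)  # {'u1': 65, 'u2': 15, 'u3': 60}
```

The same per-key aggregation scales from an in-memory list to a distributed dataset because each step depends only on one record (filter) or one key's running total (reduce).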
What would you do here:
As a Software Development Engineer 2 you will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up. The Data Engineer will lead our software developers on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimising or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Responsibilities:
• Create and maintain optimal data pipeline architecture.
• Assemble large, complex data sets that meet functional / non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, coordinating the re-design of infrastructure for greater scalability, etc.
• Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
• Keep our data separated and secure.
• Work with data and analytics experts to strive for greater functionality in our data systems.
• Support production (PROD) systems.
Role - Senior Analytics Executive
Experience - 1-2 years
Location - Open (Remote working option available)
About Company :-
Our team is made up of best-in-class digital, offline, and integrated media experts who work together to enhance media's contribution to Google's business. Our team operates in a seamlessly integrated way across strategy, planning, investment, creative, business sciences and analytics, and data and technology. The people who work here are world class, committed to establishing a new high-water mark in the media industry.
About the role/ Some of the things we'd like you to do:
- Support the Analytics team and other stakeholders with analytics agendas impacting campaigns, measurement frameworks, and campaign optimization.
- Conduct thorough data analysis using various tools and software within MFG to provide insights and recommendations that align with client needs.
- Collaborate with internal stakeholders across disciplines and countries to understand client objectives, providing support and expertise in data and analytics while identifying opportunities for advanced analytics solutions.
- Formulate strategic recommendations with the team based on gathered insights, and support the delivery of reporting for clients.
- Continuously improve project performance and processes by collaborating with the team to develop new tools and solutions.
About yourself/ Requirements:
- Bachelor's degree in a related quantitative field (e.g. Statistics, Business Analytics, Economics, Computer Science, etc.)
- 1-2 years of relevant work experience in data analysis; digital media experience desired
- Strong knowledge of various data analysis tools and software (e.g., Excel, SQL, R, Python, Tableau).
- Proficiency in statistical principles and how they apply to tasks/work items.
- Excellent problem-solving skills and the ability to analyze complex data sets.
- Strong communication and interpersonal skills, with the ability to present data-driven insights to both technical and non-technical audiences.
- Ability to work independently and as part of a team, with strong collaboration skills.
- Demonstrated ability to manage multiple projects and prioritize tasks effectively.
- Passion for continuous learning and staying current with industry trends and best practices in analytics.
Job Title: Data Analyst Associate – Data Management
Location: Mumbai, India
Company: Wissen Technology
About Us:
Wissen Technology is a leading technology and consulting firm known for its strong commitment to excellence, innovation, and customer satisfaction. We are passionate about harnessing data to drive smart business decisions and deliver impactful insights. Join us in shaping the future of data management.
Role Overview:
We are looking for a meticulous and proactive Data Analyst Associate - Data Management to join our team. This role requires a keen attention to detail, solid organizational skills, and a foundational knowledge of the asset management industry. You will support data analysis, management, and reporting initiatives, focusing on ensuring accuracy, efficiency, and improved decision-making processes.
Key Responsibilities:
- Data Management: Support data integrity and accuracy across various data systems; handle data extraction, transformation, and loading (ETL) processes.
- Analysis & Reporting: Generate insightful reports and conduct data analysis, utilizing Microsoft Excel and Business Objects; provide data support for decision-making processes.
- Collaboration: Engage with global business areas, provide data-related support, and actively participate in relevant meetings.
- Documentation & Communication: Document data management procedures, contribute to data governance practices, and effectively communicate findings with team members and stakeholders.
- Process Improvement: Identify and implement improvements in data operations to enhance efficiency and effectiveness.
- Technical Proficiency: Utilize data management tools and develop insights using BI reporting tools, including Tableau or Power BI; familiarity with Python is a plus.
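The extract-transform-load (ETL) responsibility above can be illustrated with a minimal sketch, using Python's built-in sqlite3 as a stand-in for the actual data systems (the table, tickers, and values are hypothetical):

```python
import sqlite3

# Extract: rows from a source system (inlined here; in practice a file,
# spreadsheet, or upstream database).
source_rows = [
    ("AAPL", " 190.5 "),   # untrimmed numeric string to clean up
    ("MSFT", "410.0"),
    ("AAPL", "191.0"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (ticker TEXT, price REAL)")

# Transform: trim whitespace and cast to float before loading.
clean = [(t, float(p.strip())) for t, p in source_rows]

# Load into the reporting table.
conn.executemany("INSERT INTO prices VALUES (?, ?)", clean)

# A simple aggregate query, as might feed an Excel or BI extract.
avg = conn.execute(
    "SELECT ticker, AVG(price) FROM prices GROUP BY ticker ORDER BY ticker"
).fetchall()
print(avg)  # [('AAPL', 190.75), ('MSFT', 410.0)]
```

The same extract/clean/load/report split applies regardless of whether the target is SQLite, Business Objects, or a BI tool like Tableau or Power BI.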
Qualifications:
- Education: BTech / BE / MCA, or a degree in a related field
- Industry Knowledge: Basic understanding of the asset management industry and data management practices.
- Technical Skills: Proficiency in Microsoft Excel, with foundational knowledge of Microsoft Access and Business Objects. Knowledge of BI reporting tools (Tableau, Power BI) is preferred, and Python skills are a plus.
- Communication Skills: Strong written and verbal communication abilities, with a focus on customer service and responsiveness.
- Organizational Skills: Strong multitasking, prioritization, and time management skills; attention to detail is essential.
- Project Management: Ability to coordinate and manage project tasks effectively, meeting deadlines with minimal supervision.
Why Join Wissen Technology?
- Opportunity to be part of a growing team focused on data-driven innovation and quality.
- Exposure to global clients and complex data management projects.
- Competitive benefits, including health coverage, paid time off, and a collaborative work environment.
We look forward to welcoming a detail-oriented and driven Data Analyst Associate to our team!
Role: Principal Software Engineer
We are looking for a passionate Principal Engineer - Analytics to build data products that extract valuable business insights for efficiency and customer experience. This role requires managing, processing, and analyzing large amounts of raw information in scalable databases. It also involves developing unique data structures and writing algorithms for an entirely new set of products. The candidate is required to have critical thinking and problem-solving skills, must be experienced in software development with advanced algorithms, and must be able to handle large volumes of data. Exposure to statistics and machine learning algorithms is a big plus. The candidate should have some exposure to cloud environments, continuous integration, and agile scrum processes.
Responsibilities:
• Lead projects both as a principal investigator and project manager, responsible for meeting project requirements on schedule
• Software development that creates data-driven intelligence in products backed by Big Data backends
• Exploratory analysis of the data to come up with efficient data structures and algorithms for given requirements
• The system may or may not involve machine learning models and pipelines, but will require advanced algorithm development
• Managing data in large-scale data stores (such as NoSQL DBs, time-series DBs, geospatial DBs, etc.)
• Creating metrics and evaluating algorithms for better accuracy and recall
• Ensuring efficient access and usage of data through indexing, clustering, etc.
• Collaborate with engineering and product development teams.
Requirements:
• Master’s or Bachelor’s degree in Engineering in one of these domains - Computer Science, Information Technology, Information Systems, or related field from top-tier school
• OR Master’s degree or higher in Statistics, Mathematics, with hands on background in software development.
• 8 to 10 years of experience in product development, having done algorithmic work
• 5+ years of experience working with large data sets or doing large-scale quantitative analysis
• Understanding of SaaS-based products and services.
• Strong algorithmic problem-solving skills
• Able to mentor and manage a team and take responsibility for team deadlines.
Skill set required:
• In-depth knowledge of the Python programming language
• Understanding of software architecture and software design
• Must have fully managed a project with a team
• Experience with Agile project management practices
• Experience with data processing, analytics, and visualization tools in Python (such as pandas, Matplotlib, SciPy, etc.)
• Strong understanding of SQL and querying NoSQL databases (e.g. MongoDB, Cassandra, Redis)
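The "metrics and evaluation of algorithms for better accuracy and recall" responsibility above can be sketched in a few lines of Python; `precision_recall` is a hypothetical helper written for illustration, not a named part of the role:

```python
def precision_recall(predicted, actual):
    """Precision and recall for a set of predicted vs. actual positives.

    precision = fraction of predictions that were correct;
    recall    = fraction of actual positives that were found.
    """
    predicted, actual = set(predicted), set(actual)
    true_pos = len(predicted & actual)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(actual) if actual else 0.0
    return precision, recall

# The algorithm flagged a, b, c, d; the ground truth is a, b, e.
p, r = precision_recall({"a", "b", "c", "d"}, {"a", "b", "e"})
print(p, r)  # 0.5 0.6666666666666666
```

Tracking both numbers together matters because an algorithm can trivially maximize one at the expense of the other (predict everything for perfect recall, or predict almost nothing for high precision).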
We are backed by marquee VC investors – Elevation Capital and Matrix Partners – to realize this mission. We started with three co-founders who have a pedigreed education (IIT/IIM) and professional background (Nomura, Goldman Sachs, Morgan Stanley, Barclays, Matrix Partners).
📈 Key Responsibilities
● Data and Business Analysis:
○ Lead and manage key analytical initiatives to evaluate and expand the business realized (or GTV) from our products and services
○ Work closely with internal stakeholders to understand data requirements and take the internal platforms and products live
○ Work closely with the Product and Tech team to onboard new institutes on our tech platform
○ Create independent analysis to identify gaps in current systems and optimize data processes
● Stakeholder Management:
○ Build and maintain strong relationships across the account management, Tech, and Product teams to manage critical tasks, ensure delivery, and develop tools and systems to improve process TAT
📃Desired Qualifications
● Expertise in Excel and SQL and familiarity with Python
● Good to have: knowledge of other BI tools such as QuickSight and Tableau
● Outstanding written and verbal communication skills
● 1-3 years of experience working in analytics or data operations in Fintech or consumer internet domain (desirable but not a requirement)
✅Measures Of Outcomes
● Schedule adherence to tasks
● Number of business processes changed due to vital analysis
● Number of stakeholder appreciations/escalations
🤩 Benefits
● Becoming a part of the early team
● Competitive salary
● Work with colleagues from strong backgrounds who are hungry to succeed
● Opportunity to interact with and learn from high-pedigree investors & mentors -
○ VCs: Tiger Global, Elevation Capital (erstwhile SAIF Partners), Matrix Partners
○ Select angel investors: Kunal Shah, Nithin Kamath, Amit Ranjan
✅ Key Details
● Role: Analyst/Associate
● Location: Bengaluru
● Compensation: Negotiable based on candidate's profile
● Date of Joining: ASAP
make an impact by enabling innovation and growth; someone with passion for what they do and a vision for the future.
Responsibilities:
- Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
- Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
- Become an expert on data and trends, both internal and external to Kaleidofin.
- Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
- Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
- Automate scheduling and distribution of reports and support auditing and value realization.
- Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
- Design robust data-centric solutions and architecture that incorporate technology and strong BI solutions to scale up and eliminate repetitive tasks.
Requirements:
- Experience leading development efforts through all phases of SDLC.
- 5+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
- Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
- Hands on experience in SQL, data management, and scripting (preferably Python).
- Strong data visualisation design skills, data modeling and inference skills.
- Hands-on experience in managing small teams.
- Financial services experience preferred, but not mandatory.
- Strong knowledge of architectural principles, tools, frameworks, and best practices.
- Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
- Team handling experience preferred for candidates with 5+ years of experience.
- Notice period less than 30 days.
Senior Data Engineer (Hadoop, HDFS, Kafka, Spark, Hive)
Overall experience - 8 to 12 years
Relevant Big Data experience - 3+ years in the above
Salary: Up to 20 LPA
Job location - Chennai / Bangalore /
Notice period - Immediate joiner / 15-20 days max
The Responsibilities of The Senior Data Engineer Are:
- Requirements gathering and assessment
- Break down complexity and translate requirements into specification artifacts and storyboards to build towards, using a test-driven approach
- Engineer scalable data pipelines using big data technologies including but not limited to Hadoop, HDFS, Kafka, HBase, Elastic
- Implement the pipelines using execution frameworks including but not limited to MapReduce, Spark, Hive, using Java/Scala/Python for application design.
- Mentoring juniors in a dynamic team setting
- Manage stakeholders with proactive communication upholding TheDataTeam's brand and values
A Candidate Must Have the Following Skills:
- Strong problem-solving ability
- Excellent software design and implementation ability
- Exposure and commitment to agile methodologies
- Detail oriented with willingness to proactively own software tasks as well as management tasks, and see them to completion with minimal guidance
- Minimum 8 years of experience
- Should have experience in full life-cycle of one big data application
- Strong understanding of various storage formats (ORC/Parquet/Avro)
- Should have hands-on experience in one of the Hadoop distributions (Hortonworks/Cloudera/MapR)
- Experience in at least one cloud environment (GCP/AWS/Azure)
- Should be well versed with at least one database (MySQL/Oracle/MongoDB/Postgres)
- Bachelor's in Computer Science, and preferably a Master's as well
- Should have good code review and debugging skills
Additional skills (Good to have):
- Experience in containerization (Docker/Heroku)
- Exposure to microservices
- Exposure to DevOps practices
- Experience in performance tuning of big data applications
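The test-driven approach this posting asks for when engineering pipelines can be sketched in plain Python; the `sessionize` step and its gap threshold are invented for illustration, and the real implementation would run on Spark or Hive:

```python
def sessionize(timestamps, gap=30):
    """Group a user's sorted event timestamps into sessions.

    A new session starts whenever the gap between consecutive
    events exceeds `gap` (same time unit as the timestamps).
    """
    sessions = []
    for ts in timestamps:
        if sessions and ts - sessions[-1][-1] <= gap:
            sessions[-1].append(ts)   # continue the current session
        else:
            sessions.append([ts])     # start a new session
    return sessions

# Test-first: the expected behaviour is written down and checked
# before the function is wired into a distributed pipeline.
assert sessionize([0, 10, 50, 55, 120]) == [[0, 10], [50, 55], [120]]
assert sessionize([]) == []
```

Encoding the specification as executable assertions like this is the cheapest part of the test-driven loop: the same cases later become the regression suite for the Spark implementation of the step.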