Affine
https://affine.ai
About
Affine is a pioneering consulting firm that specializes in AI-driven enterprise transformation. With a robust focus on Generative AI, we deliver cutting-edge solutions that redefine industry standards.
Our comprehensive capabilities span the entire analytical value chain, from data to insights and transformation. We provide Cloud Advisory and Assessment, Cloud Migration, Data Lake Design and Development, Big Data Powered Advanced Analytics, AI & Deep Learning Solutions, and Deployment Consulting with Architecture Design.
Jobs at Affine
- Design and build data pipelines using Spark-SQL and PySpark in Azure Databricks
- Design and build ETL pipelines using ADF
- Build and maintain a Lakehouse architecture in ADLS / Databricks.
- Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion, etc.
- Work with DevOps team to deploy solutions in production environments.
- Control data processes and take corrective action when errors are identified. Corrective action may include executing a workaround process and then identifying the cause of and solution for the data errors.
- Participate as a full member of the global Analytics team, providing solutions for and insights into data-related items.
- Collaborate with your Data Science and Business Intelligence colleagues across the world to share key learnings, leverage ideas and solutions, and propagate best practices.
- Lead projects that include other team members, and participate in projects led by other team members.
- Apply change management tools including training, communication and documentation to manage upgrades, changes and data migrations.
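The data-preparation tasks named above (cleaning, normalization, deduplication, type conversion) can be sketched in a few lines. This is only an illustrative stand-in with invented field names and sample records; in the Databricks role these steps would typically be expressed as PySpark DataFrame operations rather than plain Python.

```python
# Illustrative data-preparation sketch: deduplication, normalization, and
# type conversion. Field names and sample records are hypothetical; a real
# Databricks pipeline would use PySpark DataFrame transformations instead.

raw_rows = [
    {"id": "1", "name": " Alice ", "amount": "10.50"},
    {"id": "1", "name": "Alice", "amount": "10.50"},      # duplicate id
    {"id": "2", "name": "Bob", "amount": "not-a-number"}, # bad value
]

def clean(rows):
    seen, out = set(), []
    for row in rows:
        key = row["id"]
        if key in seen:            # deduplication on the id column
            continue
        seen.add(key)
        try:
            amount = float(row["amount"])   # type conversion
        except ValueError:
            amount = None                   # quarantine bad values as NULLs
        out.append({
            "id": int(key),
            "name": row["name"].strip().lower(),  # normalization
            "amount": amount,
        })
    return out

cleaned = clean(raw_rows)
```

The same logic maps one-to-one onto `dropDuplicates`, `trim`/`lower`, and `cast` calls in Spark-SQL/PySpark.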
Job Responsibilities:
- Design, develop, and deploy Power BI reports and dashboards that provide key insights into business performance.
- Create and maintain complex Power BI data models, integrating data from multiple sources.
- Write and optimize SQL queries to extract, manipulate, and analyze data from various databases.
- Collaborate with cross-functional teams to understand business requirements and translate them into effective BI solutions.
- Perform data analysis using Excel, Power Query, and other tools to support reporting and analytics needs.
- Monitor and troubleshoot Power BI reports and data refresh schedules to ensure consistent performance.
- Implement security measures and ensure data is presented to the appropriate audience through role-based access.
- Continuously improve the usability, interactivity, and performance of reports.
- Provide training and support to end-users on Power BI usage.
Work Mode:
- 2 to 3 days a month of work from the client office in Gurugram/Hyderabad.
- Preferred location: Gurugram/Hyderabad.
- For other locations, candidates should be willing to travel to Gurugram/Hyderabad for a few days every month.
External Skills And Expertise
Must Have Skills:
- Excel
- SQL
- Power BI DAX
- Power Query
- Power BI Online Services
Job Responsibilities:
- Code emails in Emarsys ESP environment using HTML and Emarsys tags.
- Integrate models from DS team using Emarsys and Snowflake data connections.
- Use JIRA to manage email development tickets.
- Dedicated PM to manage the partnership between developers and the BN email team, providing status updates on all projects. This person is the POC for all project discussions and support.
- Development needs: an average of 9 emails per week over the year.
- Ability to pick up and edit pre-built legacy emails, and to build emails from scratch from creative Figma files.
- QA media prior to sharing to reduce errors in layouts, copy, and links.
External Skills And Expertise
Mandatory Skills:
- Emarsys
- Digital Marketing
Optional Skills:
- SQL
- Excel
Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure the data delivery architecture remains optimal and consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including core RDD and DataFrame functions, troubleshooting, and performance tuning.
- Good understanding of object-oriented concepts and hands-on experience in Kotlin/Scala/Java, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Kotlin/Scala/Java.
- Good experience in SQL; able to write complex queries.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience is preferable.
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to cloud data platforms (AWS preferred).
- Lead client calls to flag any delays, blockers, or escalations, and collate all requirements.
- Manage project timing and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
External Skills And Expertise
Must have Skills:
- Kotlin/Scala/Java
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- Any cloud (AWS preferable)
- Kafka/Kinesis/any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices
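The Spark Streaming skill listed above largely comes down to aggregating an unbounded event stream over fixed time windows. A minimal sketch of that idea in plain Python, under invented event shapes; a production job would instead use Spark Structured Streaming's `groupBy`/`window` on a streaming DataFrame:

```python
# Minimal tumbling-window aggregation sketch. Spark Structured Streaming
# expresses this as groupBy(window(...)); here the same idea is shown in
# plain Python over a finite list of (timestamp_seconds, user_id) events.
# Timestamps and user ids are made up for illustration.
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size

def count_per_window(events):
    """Count events per user within fixed 60-second windows."""
    counts = defaultdict(int)
    for ts, user in events:
        # Align each event to the start of its window.
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(window_start, user)] += 1
    return dict(counts)

events = [(5, "u1"), (42, "u1"), (61, "u2"), (119, "u2"), (130, "u1")]
result = count_per_window(events)
```

The real streaming engine adds what this sketch omits: incremental state, late-data handling via watermarks, and fault tolerance through checkpointing.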
We are looking for a detail-oriented Business Analyst with expertise in Tableau, statistics, SQL, and Excel. The role involves data analysis, creating interactive dashboards, and providing actionable insights to support business decisions and improve operations.
Key Responsibilities:
- Data Analysis & Visualization: Use Tableau for dashboards and conduct statistical analysis.
- SQL & Data Management: Write and optimize SQL queries, ensure data accuracy.
- Reporting: Develop Excel spreadsheets, analyze business processes, and provide reports.
- Collaboration: Work with cross-functional teams to gather requirements and deliver insights.
- Project Management: Manage multiple projects, deliver timely reports, and support teams with data insights.
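The SQL side of the role above (writing and optimizing aggregate queries that dashboards sit on top of) can be sketched with Python's built-in sqlite3 module. The table, columns, and values here are hypothetical, purely for illustration:

```python
# Hypothetical example of the SQL work described above: load a small orders
# table into an in-memory SQLite database, then aggregate revenue by region,
# as a Tableau dashboard's underlying query might.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("North", 100.0), ("North", 50.0), ("South", 75.0)],
)

# Aggregate query of the kind a report would be built on.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
```

Against a production warehouse the same query shape applies; optimization then means checking the plan (e.g. `EXPLAIN`) and indexing or partitioning the grouped columns.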
Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure the data delivery architecture remains optimal and consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including core RDD and DataFrame functions, troubleshooting, and performance tuning.
- SFDC (data modelling experience) would be given preference.
- Good understanding of object-oriented concepts and hands-on experience in Scala, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL; able to write complex queries.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience is preferable.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services - DynamoDB, Redshift, Kinesis, Lambda, S3, etc. (preferred).
- Hands-on experience using AWS management tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred).
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred).
- Lead client calls to flag any delays, blockers, or escalations, and collate all requirements.
- Manage project timing and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure the data delivery architecture remains optimal and consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including core RDD and DataFrame functions, troubleshooting, and performance tuning.
- Good understanding of object-oriented concepts and hands-on experience in Scala, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL; able to write complex queries.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience is preferable.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services - DynamoDB, Redshift, Kinesis, Lambda, S3, etc. (preferred).
- Hands-on experience using AWS management tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred).
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred).
- Lead client calls to flag any delays, blockers, or escalations, and collate all requirements.
- Manage project timing and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
External Skills And Expertise
Must have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- Any cloud (AWS preferable)
- Kafka/Kinesis/any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices
Similar companies
Deltek
About the company
Project-based businesses transform the world we live in. Deltek innovates and delivers software and solutions that power them to achieve their purpose. Our industry-specific software and information solutions maximize our customers' performance at every stage of the project lifecycle by enabling superior levels of project intelligence, management and collaboration.
Deltek is the recognized global standard for project-based businesses across government contracting and professional services industries, helping more than 30,000 organizations of all sizes deliver on their mission.
With over 4,200 employees worldwide, our team of industry experts is passionately committed to creating exceptional customer experiences.
Jobs: 14
Fractal Analytics
About the company
Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired Fortune 500® companies.
Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to help CEOs and senior executives make better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation Enterprise Revenue Growth Management, Senseforth.ai to automate customer interactions at scale to grow the top line and bottom line, and Analytics Vidhya, the largest Analytics and Data Science community, offering industry-focused training programs.
Fractal has more than 3,600 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work® Institute; featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research; named a leader in the Analytics & AI Services Specialists Peak Matrix 2021 by Everest Group; and recognized as an "Honorable Vendor" in the 2022 Magic Quadrant™ for data & analytics by Gartner. For more information, visit fractal.ai
Jobs: 4
Technoidentity
About the company
Founded a decade ago by a group of seasoned professionals who came together to ask 'How might we use technology to bring the world closer to social justice?', we began as a small team creating incredible solutions for organisations working towards delivering large-scale impact. Today, we're a global organisation providing disruptive software solutions to customers who are solving important problems in their industries and in the world we live in. At TechnoIdentity, we challenge our team of passionate technologists to create forward-looking solutions for our customers that improve operational efficiency and bolster their bottom line. We empower them with the tools, methodologies, and decision-making to make real impact, fast. We invite creative minds and businesses to TechnoIdentity; let's collaborate to transform lives through technology.
Life at TechnoIdentity is shaped by the belief that everyone deserves to bring their authentic self to work, and that, as responsible adults, you will choose to spend your time wisely on the things that matter most to you.
Jobs: 1
Vikash Technologies
About the company
Jobs: 1
HyrHub
About the company
Jobs: 9
Wissen Technology
About the company
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.
Jobs: 175
GreenFortune
About the company
We are India's fastest-growing window and door system brand based out of Hyderabad. We have completed 800+ successful projects through a network of 150+ fabricators present across 100+ locations across the country. Our proprietary and tech-first approach (PartnerGate) has made us a preferred partner for businesses and a trusted brand for customers in the segment.
We currently manufacture UPVC windows and doors. Our products combine durability, energy efficiency, and modern design to meet the growing demands of homeowners and builders.
Our products are developed by our in-house innovation and product development team for Indian climatic conditions meeting international standards (EN ISO, ASTM). They are manufactured in ISO-certified manufacturing facilities with a production capacity exceeding 100,000MT.
We have raised $1.04 million (about Rs 8.5 crore) in seed funding led by Incubate Fund India, Titan Capital, Partners Fund Japan, Superb Capital, and MamaEarth founder Varun Alagh. Within one year we have grown 7X.
We operate in B2B2C, B2B, and B2C models. Eventually, B2C will be our primary channel with a pan-India presence. Within the decade, we also plan to export our products to markets around the world.
Jobs: 1
vybog
About the company
IT Consulting & Services
Product Development
Jobs: 2
CARZFIXUP
About the company
Jobs: 5
Guru Goutam Infotech Pvt Ltd
About the company
Jobs: 1