

Affine
https://affine.ai

About
Affine is a pioneering consulting firm that specializes in AI-driven enterprise transformation. With a robust focus on Generative AI, we deliver cutting-edge solutions that redefine industry standards.
Our comprehensive capabilities span the entire analytical value chain, from data to insights and transformation. We provide Cloud Advisory and Assessment, Cloud Migration, Data Lake Design and Development, Big Data Powered Advanced Analytics, AI & Deep Learning Solutions, and Deployment Consulting with Architecture Design.
Company social profiles
Jobs at Affine
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
- Programming Language: Python (Strong knowledge)
- Concurrency & Parallelism: Multithreading, Multiprocessing, AsyncIO, ThreadPoolExecutor, Future, concurrent.futures
- Memory Management: Reference Counting, Global Interpreter Lock (GIL)
- Distributed Computing: Dask, Apache Spark (Preferred)
- Data Processing: NumPy
- Inter-Service Communication: gRPC, REST APIs
- Containerization & Orchestration: Docker, Kubernetes
- Software Development Practices: Code Optimization, Debugging, Performance Tuning
- Communication & Problem-Solving: Technical Documentation, Team Collaboration, Asking for Clarity When Needed
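As an illustration of the concurrency skills listed above, here is a minimal sketch (URLs and function names are illustrative, not from the posting) showing why a `ThreadPoolExecutor` from `concurrent.futures` speeds up I/O-bound work despite the GIL:

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch(url: str) -> str:
    """Stand-in for an I/O-bound call (e.g. a network request)."""
    time.sleep(0.1)  # simulate latency; threads can overlap this wait
    return f"payload from {url}"

urls = [f"https://example.com/{i}" for i in range(8)]

# Because the GIL is released during blocking I/O, a thread pool
# lets the eight simulated requests wait concurrently instead of
# one after another.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(fetch, u): u for u in urls}
    results = [f.result() for f in as_completed(futures)]
elapsed = time.perf_counter() - start

print(len(results))   # 8 payloads collected
print(elapsed < 0.8)  # True: far below the ~0.8 s a sequential loop would take
```

For CPU-bound work, where the GIL does serialize threads, the same pattern applies with `ProcessPoolExecutor` (multiprocessing) instead.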
Skills And Expertise
- Python
- Multithreading
- Multiprocessing
- Dask, Apache Spark
- NumPy
- REST API
- Docker
- Kubernetes
- Code Optimization
Job Responsibilities:
- Design, develop, and deploy Power BI reports and dashboards that provide key insights into business performance.
- Create and maintain complex Power BI data models, integrating data from multiple sources.
- Write and optimize SQL queries to extract, manipulate, and analyze data from various databases.
- Collaborate with cross-functional teams to understand business requirements and translate them into effective BI solutions.
- Perform data analysis using Excel, Power Query, and other tools to support reporting and analytics needs.
- Monitor and troubleshoot Power BI reports and data refresh schedules to ensure consistent performance.
- Implement security measures and ensure data is presented to the appropriate audience through role-based access.
- Continuously improve the usability, interactivity, and performance of reports.
- Provide training and support to end-users on Power BI usage.
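The SQL work described above (writing and optimizing queries that feed reports) can be sketched with Python's built-in sqlite3 module; the table and column names here are hypothetical, not from the posting:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical sales table of the kind that feeds a dashboard.
cur.execute("CREATE TABLE sales (region TEXT, amount REAL, sold_on TEXT)")
cur.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 120.0, "2024-01-05"),
     ("North", 80.0, "2024-01-09"),
     ("South", 200.0, "2024-01-07")],
)

# An index on the filter/grouping column helps the engine avoid a
# full table scan as the data grows.
cur.execute("CREATE INDEX idx_sales_region ON sales (region)")

# Aggregate query of the kind a report or refresh would consume.
cur.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
)
rows = cur.fetchall()
print(rows)  # [('North', 200.0), ('South', 200.0)]
conn.close()
```

In practice the same query shape would run against the production database and land in Power BI via a data source connection rather than sqlite3.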
Work Mode:
- Work from the client office in Gurugram/Hyderabad 2 to 3 days a month.
- Preferred location: Gurugram/Hyderabad.
- Candidates in other locations should be willing to travel to Gurugram/Hyderabad for a few days every month.
External Skills And Expertise
Must Have Skills:
- Excel
- SQL
- Power BI DAX
- Power Query
- Power BI Online Services
Job Responsibilities:
- Code emails in Emarsys ESP environment using HTML and Emarsys tags.
- Integrate models from DS team using Emarsys and Snowflake data connections.
- Use JIRA to manage email development tickets.
- Work with a dedicated PM who manages the partnership between the developers and the BN email team, provides status updates on all projects, and serves as the point of contact for all project discussions and support.
- Development volume: an average of 9 emails per week over the year.
- Ability to pick up and edit pre-built legacy emails and to build emails from scratch using creative Figma files.
- QA media prior to sharing to reduce errors in layouts, copy, and links.
External Skills And Expertise
Mandatory Skills:
- Emarsys
- Digital Marketing
Optional Skills:
- SQL
- Excel
We are looking for a detail-oriented Business Analyst with expertise in Tableau, statistics, SQL, and Excel. The role involves data analysis, creating interactive dashboards, and providing actionable insights to support business decisions and improve operations.
Key Responsibilities:
- Data Analysis & Visualization: Use Tableau for dashboards and conduct statistical analysis.
- SQL & Data Management: Write and optimize SQL queries, ensure data accuracy.
- Reporting: Develop Excel spreadsheets, analyze business processes, and provide reports.
- Collaboration: Work with cross-functional teams to gather requirements and deliver insights.
- Project Management: Manage multiple projects, deliver timely reports, and support teams with data insights.
Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, and for optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing and building data systems. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that an optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning.
- SFDC (data modelling) experience would be given preference.
- Good understanding of object-oriented concepts and hands-on experience with Scala, with excellent programming logic and technique.
- Good grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL; should be able to write complex queries.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Able to mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
- Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Lead client calls to flag any delays, blockers, and escalations, and collate all requirements.
- Manage project timelines and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
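The RDD and DataFrame operations mentioned above follow a functional map/filter/reduce style. The sketch below illustrates that style with plain Python builtins rather than Spark itself (running real Spark code requires a pyspark or Scala environment); the data is invented for illustration:

```python
from functools import reduce

lines = [
    "spark streaming handles micro batches",
    "spark uses rdd and dataframe apis",
]

# Classic word count expressed as a chain of pure transformations,
# mirroring flatMap / map / reduceByKey on an RDD.
words = (w for line in lines for w in line.split())  # flatMap: line -> words
pairs = ((w, 1) for w in words)                      # map: word -> (word, 1)

def merge(counts: dict, pair: tuple) -> dict:        # reduceByKey analogue
    word, n = pair
    counts[word] = counts.get(word, 0) + n
    return counts

counts = reduce(merge, pairs, {})
print(counts["spark"])  # 2
```

In actual Spark the same pipeline distributes each stage across executors; the functional structure, which the role's Scala requirement also emphasizes, is identical.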
External Skills And Expertise
Must Have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- AWS (preferred) or any cloud
- Kafka/Kinesis or any streaming service
- Object-Oriented Programming
- Hive; ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices