
Looking for an AWS Architect for our client branch in Noida
Experience: 8+ years of relevant experience
Location: Noida
Notice period: Immediate to 30-day joiners
Must have: AWS, Spark, Python, AWS Glue, AWS S3, AWS Redshift, SageMaker, AppFlow, AWS Lake Formation, AWS Security, Lambda, EMR, SQL
Good to have:
CloudFormation, VPC, Java, Snowflake, Talend, RDS, Databricks
Responsibilities:
- Mentor the data architect and team; architect and implement ETL and data-movement solutions such as ELT/ETL data pipelines.
- Apply industry best practices, enforce security, manage compliance, and adopt appropriate design patterns.
- Test, monitor, and validate data warehouse activity, including data extraction, transformation, movement, loading, cleansing, and updating processes.
- Code, test, and implement datasets; define and maintain data engineering standards.
- Architect, design, and implement database solutions for Big Data, real-time data ingestion, data warehousing, and SQL/NoSQL.
- Architect, design, and implement automated workflows and routines using workflow-scheduling tools.
- Identify optimization opportunities and automate them to improve performance.
- Conduct appropriate functional and performance testing to identify bottlenecks and data quality issues.
- Be able to implement slowly changing dimensions as well as transaction, accumulating snapshot, and periodic snapshot fact tables.
- Collaborate with business users, EDW team members, and other developers throughout the organization to help everyone understand issues that affect the data warehouse.
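The slowly-changing-dimension requirement above can be sketched in plain Python: a minimal Type 2 upsert over rows held as dicts. All column names (`valid_from`, `is_current`, etc.) are illustrative assumptions, not part of the posting; a real implementation would run as SQL `MERGE` or a Spark job.

```python
from datetime import date

def scd2_upsert(dimension, incoming, key, tracked, today=None):
    """Type 2 slowly changing dimension: close the current row and
    append a new versioned row whenever a tracked attribute changes."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        old = current.get(row[key])
        if old is None:
            # brand-new key: insert as the current version
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif any(old[c] != row[c] for c in tracked):
            # tracked attribute changed: expire the old row, add a new one
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "Pune",
        "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
scd2_upsert(dim, [{"customer_id": 1, "city": "Noida"}],
            key="customer_id", tracked=["city"], today="2024-06-01")
```

The same close-and-append pattern generalizes to the transaction, accumulating-snapshot, and periodic-snapshot fact tables the role mentions.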

- Manage individual project priorities, deadlines, and deliverables
- Gather and process raw data at scale (including writing scripts, web scraping, calling/creating APIs, etc.)
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources
- Identify, analyze, design, and implement internal process improvements
- Design and implement tooling upgrades to increase stability and data quality
- Help the team fix issues that occur in test and production environments
- Automate software development processes, including build, deploy, and test
- Manage and guide team members
Role & responsibilities
1. Develop and maintain native iOS applications using Swift and Objective-C.
2. Familiarity with RESTful APIs to connect iOS applications to back-end services.
3. Work with APIs and third-party libraries; familiarity with MVC and MVVM architectures.
4. Proficient in code versioning tools, including GitHub and GitLab.
5. Familiarity with push notifications, APIs, and cloud messaging.
6. Identifying potential problems and resolving application bugs.
7. Fixing application bugs before the final release.
8. Publishing application on App Store.
9. Maintain code quality and ensure the performance, quality, and responsiveness of applications.
10. Collaborate with cross-functional teams for seamless integration.
11. Stay up to date with the latest industry trends and technologies.
Preferred candidate profile
- Minimum 3+ years of iOS development experience, or a previously shipped app.
- Strong knowledge of iOS frameworks (UIKit, SwiftUI, Core Data).
- Experience with UI/UX design, RESTful APIs, and third-party libraries.
- Problem solver with good analytical skills.
- Knowledge of Firebase integration is an added advantage.
- Previous experience in the fintech industry is a plus.
Perks and benefits
- Competitive salary, professional development opportunities, and training.
- Opportunity to work with cutting-edge technologies in a fast-paced environment.
- Collaborative and supportive work environment.
Apply: https://forms.gle/5aZjp5New3cxGZXN9
Role Description
This is a full-time client facing on-site role for a Data Scientist at UpSolve Solutions in Mumbai. The Data Scientist will be responsible for performing various day-to-day tasks, including data science, statistics, data analytics, data visualization, and data analysis. The role involves utilizing these skills to provide actionable insights to drive business decisions and solve complex problems.
Qualifications
- Data Science, Statistics, and Data Analytics skills
- Data Visualization and Data Analysis skills
- Strong problem-solving and critical thinking abilities
- Ability to work with large datasets and perform data preprocessing
- Proficiency in programming languages such as Python or R
- Experience with machine learning algorithms and predictive modeling
- Excellent communication and presentation skills
- Bachelor's or Master's degree in a relevant field (e.g., Computer Science, Statistics, Data Science)
- Experience in the field of video and text analytics is a plus
Requirements:
- Hands-on working knowledge and experience is required in:
- Extensive experience working with C#, .NET, and .NET Core frameworks
- Extensive experience working with React, JavaScript, and TypeScript
- Relational databases (SQL Server, Oracle, PostgreSQL, etc.)
- NoSQL databases (MongoDB, Cloud Spanner, etc.)
- Agile methodologies (Scrum, TDD, BDD, etc.)
- Experience working with distributed teams across regions and time zones
- Strong organisational skills
- Display detailed, critical, quality-oriented, skeptical thinking about the product
- Experience with several of the following tools/technologies is desirable: GIT, Jira, Jenkins, SharePoint, Visual Studio Code.
- Microservices architecture, Domain-Driven Design, and Test-Driven Development are a bonus
- Knowledge of design patterns and experience implementing them
- Development of complex application and system architectures
- Data structures and algorithms using TypeScript, C#, and .NET
- Experience working in Google Cloud is a big bonus, as all our systems are in the cloud
- Knowledge of REST and gRPC APIs is a bonus
- Knowledge of the following technologies is a plus:
- Continuous Integration and Continuous Delivery Tools like GitHub, Git, etc.
- Containerisation Technologies (Docker)
Job Description
We are looking for a Product Marketing Associate who will streamline our product marketing efforts. You will work closely with the marketing, sales and product teams. You will help research, compile and develop positioning, messaging and communication for marketing and sales.
Responsibilities
Research and manage marketing and sales collaterals
Research and structure content for webpages and landing pages
Create high-quality content for Gumlet in the form of sales/marketing collaterals, email communications, website and landing page copy.
Ideate and help build compelling narratives and messaging for Gumlet across marketing channels
Streamline and oversee lead-nurturing campaigns
Organize and track content distribution
Conduct competitive, market and customer research, and build usable insights to further the marketing and sales efforts.
Work on projects to improve and track the middle and bottom of the funnel
Work closely with sales and marketing teams to identify opportunities for new client acquisition.
Skills And Qualification
Work experience as a marketing associate or a similar role
In-depth understanding of marketing fundamentals
Excellent communication skills
Knack for copywriting
Analytical abilities
Education requirements
BE/BTech or equivalent degree.
6 months to 1 year of core marketing experience, preferably with tech/product/SaaS companies
Excellent command of U.S. English (understanding, speaking, and writing).
Responsibilities:
- Ability to work in an existing codebase and collaborate with a diverse team
- Experience in building enterprise-scale backend REST APIs with frameworks such as Nest.js & Express.js using an API-first paradigm
- Intimate knowledge of crafting highly performing database queries
- Hands-on experience implementing relational database structures, including tables, indexes, views, etc.
- A mindset towards building systems for the cloud and DevOps fundamentals
- Working knowledge of cloud infrastructure services is good to have; otherwise, a willingness to learn is expected.
- Focus towards building security, performance, and scalability into services from the beginning
- Experience with debugging code and troubleshooting technical issues in order to craft appropriate solutions
Technologies & Languages
- Azure
- Databricks
- SQL Server
- ADF
- Snowflake
- Data Cleaning
- ETL
- Azure Devops
- Intermediate Python/Pyspark
- Intermediate SQL
- Beginner's knowledge of, or willingness to learn, Spotfire
- Data Ingestion
- Familiarity with CI/CD or Agile
Must have:
- Azure – VM, Data Lake, Databricks, Data Factory, Azure DevOps
- Python/Spark (PySpark)
- SQL
Good to have:
- Docker
- Kubernetes
- Scala
The candidate should have a good understanding of:
- How to build pipelines – ETL and ingestion
- Data Warehousing
- Monitoring
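The pipeline skills listed above (ETL, ingestion, data cleaning) can be sketched as a minimal extract-transform-load routine in plain Python; a PySpark job would follow the same extract/transform/load shape. The sample data, field names, and cleaning rules here are illustrative assumptions only.

```python
import csv
import io

# Hypothetical raw feed: whitespace noise, a missing amount, mixed casing
RAW = """id,amount,country
1, 100 ,IN
2,,US
3, 250 ,in
"""

def extract(source):
    # Ingest: parse raw CSV text into a list of row dicts
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    # Clean: drop rows missing an amount, normalise types and casing
    cleaned = []
    for r in rows:
        amount = r["amount"].strip()
        if not amount:
            continue
        cleaned.append({"id": int(r["id"]),
                        "amount": float(amount),
                        "country": r["country"].strip().upper()})
    return cleaned

def load(rows, target):
    # Load: append cleaned rows to the target "table"
    target.extend(rows)
    return target

warehouse = []
load(transform(extract(RAW)), warehouse)
```

Each stage is a separate function so it can be monitored and tested independently, which is the property the monitoring bullet above is asking for.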
Responsibilities:
Must be able to write quality code and build secure, highly available systems.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Monitor performance and advise on any necessary infrastructure changes.
Define data retention policies.
Implement the ETL process and an optimal data pipeline architecture.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Create design documents that describe the functionality, capacity, architecture, and process.
Develop, test, and implement data solutions based on finalized design documents.
Work with data and analytics experts to strive for greater functionality in our data systems.
Proactively identify potential production issues and recommend and implement solutions
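The "build analytics tools that utilize the data pipeline" responsibility above can be illustrated with a small aggregation sketch in plain Python. The metric (revenue by acquisition channel) and field names are hypothetical, chosen only to mirror the customer-acquisition example in the posting.

```python
from collections import defaultdict

def revenue_by_channel(events):
    """Roll pipeline events up into a per-channel revenue metric."""
    totals = defaultdict(float)
    for e in events:
        totals[e["channel"]] += e["revenue"]
    return dict(totals)

# Hypothetical events emitted by the data pipeline
events = [
    {"channel": "web",    "revenue": 120.0},
    {"channel": "mobile", "revenue": 80.0},
    {"channel": "web",    "revenue": 50.0},
]
metrics = revenue_by_channel(events)
```

In practice the same roll-up would be a SQL `GROUP BY` or a Spark aggregation over the warehouse tables the pipeline loads.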







