
Role: SAP ABAP Lead Consultant
Experience: 10+ years
Location: Hyderabad
Project Duration: 2+ years
Secondary Skills: SAP ABAP-HR, Fiori, Web Dynpro

**Company**
Crypto made easy 🚀 We are the bridge between your crypto world and everyday life; trade pairs, book flights and hotels, and purchase gift cards with your favourite currencies. All in one best-in-class digital experience. It's not rocket science.
**Why Join?**
By joining CryptoXpress, you'll be at the cutting edge of merging digital currency with real-world services and products. We offer a stimulating work environment where innovation and creativity are highly valued. This remote role provides the flexibility to work from any location, promoting a healthy work-life balance. We are dedicated to fostering growth and learning, offering ample opportunities for professional development in the rapidly expanding fields of AI, blockchain technology, and e-commerce.
**Role Description**
We're looking for an enthusiastic Junior Growth Marketing Associate to join our marketing team. This is a perfect opportunity for someone starting their career in marketing who has a passion for cryptocurrency and technology. You'll work closely with our marketing team to learn and execute growth strategies while building a strong foundation in crypto marketing.
**Responsibilities**
- Research and propose innovative growth initiatives combining crypto incentives with travel experiences
- Create strategic briefs and execution roadmaps for the marketing team
- Identify viral marketing opportunities at the intersection of Web3 and travel
- Lead ideation sessions for new user acquisition and engagement campaigns
- Analyze competitors and market trends to spot growth opportunities
- Design A/B testing frameworks to validate growth hypotheses (see the statistics sketch after this list)
- Present growth recommendations backed by market research and data
- Coordinate with design, dev, and marketing teams to bring ideas to life
- Break down campaign concepts into actionable tasks for team execution
- Monitor campaign results and recommend optimization strategies
- Stay ahead of emerging trends in both crypto and travel industries
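For context on the A/B testing responsibility above, the sketch below shows the statistics behind a simple readout: a two-proportion z-test comparing the conversion rates of a control and a variant. The numbers and the function are invented for illustration; CryptoXpress's actual tooling is not specified in the posting.

```python
# Rough sketch of the statistics behind an A/B test readout: a two-proportion
# z-test comparing conversion rates of a control and a variant arm.
# All numbers are made up for illustration.
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))    # standard error
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))                 # two-sided p-value

# Example: 4.0% vs 4.8% conversion with 5,000 users per arm.
print(ab_test(conv_a=200, n_a=5000, conv_b=240, n_b=5000))    # p is roughly 0.05
```

In practice a test like this would be paired with a pre-registered success metric and sample size before the campaign launches.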
**Qualifications**
- No degree requirement
**Required Skills**
- Basic Digital Marketing
- Social Media Management
- Written Communication
- Analytics
- Organizational Skills
- Microsoft Office/Google Suite
- Use AI to speed up your work
**How to Apply:**
Interested candidates must complete the application form at https://forms.gle/HuCbCCbVvh49LVKk7
Join us and help shape the future of social media marketing in the cryptocurrency space!
**Tips for Application Success:**
- Show your enthusiasm for financial products
- Mention any self-learning initiatives
- Be honest about what you don't know
- Explore and understand what CryptoXpress does before applying

AMAZECH SOLUTIONS: COMPANY PROFILE
Amazech Solutions is a consulting and services company in the information technology industry. Established in 2007, we are headquartered in Frisco, Texas, U.S.A. The leadership team at Amazech brings expertise stemming from over 40 man-years of experience developing software solutions for global organizations across verticals including Healthcare, Banking Services, and Media & Entertainment.
We currently provide services to a wide spectrum of clients ranging from start-ups to Fortune 500 companies. We are actively engaged in government projects, being an SBA-approved company as well as HUB-certified by the State of Texas.
Our customer-centric approach comes from understanding that our clients need more than technology professionals. This is an exciting time to join Amazech as we look to grow our team in India, which comprises IT professionals with strong competence in both common and niche skill areas.
Location: Bangalore (Hybrid)
Employment type: Full-time
Skill set: .NET, Angular
Permanent website: www.amazech.com
Description:
As a Data Engineering Lead, you will be at the forefront of shaping and managing our data infrastructure, with a primary focus on Google Cloud Platform (GCP). You will lead a team of data engineers to design, develop, and maintain our data pipelines, ensuring data quality, scalability, and availability for critical business insights.
Key Responsibilities:
1. Team Leadership:
a. Lead and mentor a team of data engineers, providing guidance, coaching, and performance management.
b. Foster a culture of innovation, collaboration, and continuous learning within the team.
2. Data Pipeline Development (Google Cloud Focus):
a. Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, and Dataprep.
b. Implement best practices for data extraction, transformation, and loading (ETL) processes on GCP (a minimal pipeline sketch follows this list).
3. Data Architecture and Optimization:
a. Define and enforce data architecture standards, ensuring data is structured and organized efficiently.
b. Optimize data storage, processing, and retrieval for maximum performance and cost-effectiveness on GCP.
4. Data Governance and Quality:
a. Establish data governance frameworks and policies to maintain data quality, consistency, and compliance with regulatory requirements.
b. Implement data monitoring and alerting systems to proactively address data quality issues.
5. Cross-functional Collaboration:
a. Collaborate with data scientists, analysts, and other cross-functional teams to understand data requirements and deliver data solutions that drive business insights.
b. Participate in discussions regarding data strategy and provide technical expertise.
6. Documentation and Best Practices:
a. Create and maintain documentation for data engineering processes, standards, and best practices.
b. Stay up-to-date with industry trends and emerging technologies, making recommendations for improvements as needed.
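To make responsibility 2 concrete, here is a minimal sketch of the kind of GCP batch pipeline described above, using Apache Beam (the Dataflow SDK) to read CSV records from Cloud Storage and load them into BigQuery. The project, bucket, table, and schema names are placeholders, not details from the posting.

```python
# Minimal ETL sketch: a Beam/Dataflow batch job that reads CSV rows from
# Cloud Storage and loads them into BigQuery. All resource names below
# (project, bucket, dataset, table) are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line: str) -> dict:
    """Transform step: turn one CSV line into a BigQuery-ready dict."""
    order_id, amount, country = line.split(",")
    return {"order_id": order_id, "amount": float(amount), "country": country}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # or "DirectRunner" for local testing
        project="example-project",          # placeholder GCP project
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Extract" >> beam.io.ReadFromText("gs://example-bucket/orders.csv",
                                                skip_header_lines=1)
            | "Transform" >> beam.Map(parse_row)
            | "Load" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                schema="order_id:STRING,amount:FLOAT,country:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```

The same pipeline runs locally with the DirectRunner for testing and on Dataflow in production by switching the runner option.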
Qualifications
● Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
● 5+ years of experience in data engineering, with a strong emphasis on Google Cloud Platform.
● Proficiency in Google Cloud services, including BigQuery, Dataflow, Dataprep, and Cloud Storage.
● Experience with data modeling, ETL processes, and data integration.
● Strong programming skills in languages like Python or Java.
● Excellent problem-solving and communication skills.
● Leadership experience and the ability to manage and mentor a team.

- Discussing the project aims with the client and the development team.
- Designing and building web applications using Laravel.
- Troubleshooting issues in the implementation and debugging builds.
- Working with front-end and back-end developers on projects.
- Testing functionality for users and the backend.
- Ensuring that integrations run smoothly.
- Scaling projects based on client feedback.
- Recording and reporting on work done in Laravel.
- Maintaining web-based applications.
- Presenting work in meetings with clients and management.
As an MLOps Engineer at QuantumBlack you will:
- Develop and deploy technology that enables data scientists and data engineers to build, productionize, and deploy machine learning models following best practices.
- Set the standards for SWE and DevOps practices within multi-disciplinary delivery teams.
- Choose and use the right cloud services, DevOps tooling, and ML tooling so the team can produce high-quality code and release to production.
- Build modern, scalable, and secure CI/CD pipelines to automate the development and deployment workflows used by data scientists (ML pipelines) and data engineers (data pipelines).
- Shape and support next-generation technology that enables scaling ML products and platforms, bringing expertise in cloud to enable ML use-case development, including MLOps.
Our Tech Stack:
We leverage AWS, Google Cloud, Azure, Databricks, Docker, Kubernetes, Argo, Airflow, Kedro, Python, Terraform, GitHub Actions, MLflow, Node.js, React, and TypeScript, amongst others, in our projects.
Key Skills:
• Excellent hands-on expert knowledge of cloud platform infrastructure and administration (Azure/AWS/GCP), with strong knowledge of cloud services integration and cloud security
• Expertise setting up CI/CD processes and building and maintaining secure DevOps pipelines with at least 2 major DevOps stacks (e.g., Azure DevOps, GitLab, Argo)
• Experience with modern development methods and tooling: containers (e.g., Docker) and container orchestration (K8s), CI/CD tools (e.g., CircleCI, Jenkins, GitHub Actions, Azure DevOps), version control (Git, GitHub, GitLab), and orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow)
• Hands-on coding skills in Python 3 (e.g., APIs), including automated testing frameworks and libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform), and Kubernetes artifacts (e.g., deployments, operators, Helm charts)
• Experience setting up at least one contemporary MLOps tool (e.g., experiment tracking, model governance, packaging, deployment, feature store); a minimal experiment-tracking sketch follows this list
• Practical knowledge of delivering and maintaining production software such as APIs and cloud infrastructure
• Knowledge of SQL (intermediate level or better preferred) and familiarity working with at least one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
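As an illustration of the experiment-tracking item above, here is a minimal MLflow sketch (MLflow appears in the tech stack listed earlier) that logs hyperparameters, a metric, and the trained model for a single run. The tracking URI, experiment name, and model choice are placeholders, not project specifics.

```python
# Minimal experiment-tracking sketch with MLflow: log parameters, a metric,
# and the trained model for one run. The tracking URI and experiment name
# are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://localhost:5000")   # placeholder tracking server
mlflow.set_experiment("demo-experiment")

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=0).fit(X_train, y_train)

    mlflow.log_params(params)                                    # hyperparameters
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, artifact_path="model")       # versioned artifact
```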

Responsibilities:
● Take end-to-end ownership of products.
● You're able to work with Product Managers and Designers and call out things upfront from an engineering perspective.
● You always keep end customers in mind while building solutions.
● You always push for code quality and better approaches instead of quick, short-term hacks.
Requirements:
● Zeal to learn and make an impact.
● You should be finding solutions to problems, not reasons why they can't be solved.
● Able to drive problem statements from design to implementation.
● Open to giving and receiving feedback.
● Should be able to work independently with little or no hand-holding.
● You have prior experience working with JavaScript, especially React and its ecosystem.
● It'll be a bonus if you've worked with GraphQL.
● You're an engineer by heart and not just a developer.
● You're innovative and able to pitch ideas to your colleagues.
● You're not hesitant to come up with solutions.
● You care about the product as if it were your own.
● You're able to communicate well.
● You love hanging out.
Must Have Skills:

● We believe that the role of an engineer at a typical product company in India has to evolve from just working in a request response mode to something more involved.
● Typically an engineer has very little to no connection with the product, its users, overall success criteria or long term vision of the product that he/she is working on.
● The system is not set up to encourage it. Engineers are evaluated on their tech prowess, and very little attention is given to other aspects of being a successful engineer.
● We don't hold appraisals because we believe that evaluation of work and feedback should be a constant affair, not something that happens every 6 or 12 months. Besides, there is no better testament to your abilities than the growth of the product.
● We don’t have a concept of hierarchy and hence we don’t have promotions. All we have in Udaan are Software Engineers.
Skills & Knowledge:
○ 4-15 years of experience
○ Sound knowledge of programming
○ High ownership and impact-oriented
○ Creative thinking and strong implementation skills
○ Highly Customer Obsessed & Always Insisting on Highest Standards

Role and Responsibilities
- Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
- Build robust RESTful APIs that serve data and insights to DataWeave and other products (a minimal sketch follows this list)
- Design user interaction workflows on our products and integrate them with data APIs
- Help stabilize and scale our existing systems. Help design the next-generation systems.
- Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
- Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
- Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
- Constantly think scale, think automation. Measure everything. Optimize proactively.
- Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
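As a rough illustration of the API-serving responsibility above (first bullet), here is a minimal FastAPI sketch of a read-only insights endpoint with a Redis cache in front of the primary store. The endpoint path, cache keys, and data shapes are invented for the example and are not DataWeave's actual API.

```python
# Minimal sketch of a low-latency read API: serve precomputed insights from a
# Redis cache, falling back to the primary store on a miss. Endpoint names,
# cache keys, and the fallback function are hypothetical.
import json
from typing import Optional

import redis
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Insights API (sketch)")
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_from_primary_store(product_id: str) -> Optional[dict]:
    """Placeholder for a MySQL/Cassandra/Elasticsearch lookup."""
    return None  # pretend nothing was found

@app.get("/products/{product_id}/insights")
def get_insights(product_id: str) -> dict:
    cached = cache.get(f"insights:{product_id}")
    if cached:
        return json.loads(cached)  # cache hit: fast path for dashboards

    insights = load_from_primary_store(product_id)
    if insights is None:
        raise HTTPException(status_code=404, detail="product not found")

    cache.setex(f"insights:{product_id}", 300, json.dumps(insights))  # 5-minute TTL
    return insights
```

Serving precomputed results from a cache keeps the dashboard path low-latency while the heavier analytics pipeline refreshes insights behind it.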
Skills and Requirements
- 8-15 years of experience building and scaling APIs and web applications.
- Experience building and managing large scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
- Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
- Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
- It's a huge bonus if you have personal projects (including open-source contributions) that you work on in your spare time. Show off some of the projects you have hosted on GitHub.

