- We are looking for: Data Engineer
- Notice period: 15 to 30 days
- Location: Bangalore / Noida
About Amagi Media Labs
Candidates should have a passion for learning and adapting to new technologies, be able to understand and troubleshoot issues and risks, make informed decisions, and lead projects.
- 2-5 years' experience with functional programming
- Experience with functional programming in Scala using the Spark framework
- Strong understanding of Object-oriented programming, data structures and algorithms
- Good experience with at least one of the major cloud platforms (Azure, AWS, GCP)
- Experience with distributed (multi-tiered) systems, relational databases, and NoSQL storage solutions
- Desire to learn new technologies and languages
- Participation in software design, development, and code reviews
- High level of proficiency with Computer Science/Software Engineering knowledge and contribution to the technical skills growth of other team members
- Design, build and configure applications to meet business process and application requirements
- Proactively identify and communicate potential issues and concerns and recommend/implement alternative solutions as appropriate.
- Troubleshooting and optimization of existing solutions
- Provide advice on technical design to ensure solutions are forward-looking and flexible for potential future requirements and business needs.
- Please apply only if you are serious about joining us; otherwise the interview process is a waste of everyone's time.
This role is NOT for you if you:
- Want to be a Power BI, Qlik, or Tableau-only developer
- Are a machine-learning aspirant
- Are a data scientist
- Want to write only Python scripts
- Want to do AI
- Want to do 'BIG' data
- Want to do Hadoop
- Are a fresh graduate
- Write SQL and Python for complicated analytical queries.
- Understand the client's existing business problems and map their needs to the schema they have.
- Neatly decompose problems into components and solve them using SQL.
- Have worked on existing BI products.
- Develop solutions with our exciting new BI product for our clients.
- Be very experienced and comfortable writing SQL against very complicated schemas to help answer business questions.
- Have an analytical thought process.
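A minimal sketch of the SQL-decomposition skill described above, using Python's built-in sqlite3. The `orders` schema and the business question are hypothetical illustrations, not a real client's data model.

```python
import sqlite3

# Hypothetical schema for illustration: orders(order_id, customer_id, amount, placed_at)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL, placed_at TEXT)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        (1, 101, 50.0, "2023-01-05"),
        (2, 101, 70.0, "2023-02-10"),
        (3, 102, 20.0, "2023-01-15"),
    ],
)

# Business question: "What is each customer's average monthly spend?"
# Decomposed into components with CTEs: (1) spend per customer per month,
# (2) the average of those monthly totals per customer.
query = """
WITH monthly AS (
    SELECT customer_id,
           substr(placed_at, 1, 7) AS month,
           SUM(amount) AS monthly_spend
    FROM orders
    GROUP BY customer_id, month
)
SELECT customer_id, AVG(monthly_spend) AS avg_monthly_spend
FROM monthly
GROUP BY customer_id
ORDER BY customer_id
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(101, 60.0), (102, 20.0)]
```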
● Our Infrastructure team is looking for an excellent Big Data Engineer to join a core group that designs the industry's leading Micro-Engagement Platform. The role involves designing and implementing big-data architectures and frameworks for our intelligent workflow automation platform. As a specialist on the Ushur Engineering team, your responsibilities will include:
● Use your in-depth understanding to architect and optimize databases and data ingestion pipelines
● Develop HA strategies, including replica sets and sharding, for highly available clusters
● Recommend and implement solutions to improve performance and resource consumption
● On an ongoing basis, identify bottlenecks in databases in development and production
environments and propose solutions
● Help DevOps team with your deep knowledge in the area of database performance, scaling,
tuning, migration & version upgrades
● Provide verifiable technical solutions to support operations at scale and with high availability
● Recommend appropriate data processing toolset and big data ecosystems to adopt
● Design and scale databases and pipelines across multiple physical locations on cloud
● Conduct Root-cause analysis of data issues
● Be self-driven, constantly research and suggest latest technologies
The experience you need:
● Engineering degree in Computer Science or related field
● 10+ years of experience working with databases, most of which should have been with big-data systems
● Expertise in implementing and maintaining distributed big-data pipelines and ETL processes
● Solid experience with one of the cloud-native data platforms (AWS Redshift, Google BigQuery, or similar)
● Exposure to real-time processing technologies such as Apache Kafka, and to CDC tools (Debezium, Qlik Replicate)
● Strong experience with the Linux operating system
● Solid knowledge of database concepts and of MongoDB, SQL, and NoSQL internals
● Experience with backup and recovery for production and non-production environments
● Experience with security principles and their implementation
● Exceptionally passionate about always keeping the product quality bar extremely high
● Proficient in one or more of Python, Node.js, Java, or similar languages
Why You Want to Work with Us:
● Great Company Culture. We pride ourselves on having a values-based culture that
is welcoming, intentional, and respectful. Our internal NPS of over 65 speaks for
itself - employees recommend Ushur as a great place to work!
● Bring your whole self to work. We are focused on building a diverse culture, with
innovative ideas where you and your ideas are valued. We are a start-up and know
that every person has a significant impact!
● Rest and Relaxation. 13 paid leaves, Wellness Friday offs (a day off to care for yourself, every last Friday of the month), 12 paid sick leaves, and more!
● Health Benefits. Preventive health checkups, Medical Insurance covering the
dependents, wellness sessions, and health talks at the office
● Keep learning. One of our core values is Growth Mindset - we believe in lifelong
learning. Certification courses are reimbursed. Ushur Community offers wide
resources for our employees to learn and grow.
● Flexible Work. In-office or hybrid working model, depending on position and
location. We seek to create an environment for all our employees where they can
thrive in both their profession and personal life.
- Responsible for implementation and ongoing administration of Hadoop
- Aligning with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop, and to expand existing ones
- Working with data delivery teams to setup new Hadoop users. This job includes
setting up Linux users, setting up Kerberos principals and testing HDFS, Hive, Pig
and MapReduce access for the new users.
- Cluster maintenance as well as creation and removal of nodes using tools like
Ganglia, Nagios, Cloudera Manager Enterprise, Dell Open Manage and other tools
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performances and capacity planning
- Monitor Hadoop cluster connectivity and security
- Manage and review Hadoop log files.
- File system management and monitoring.
- Diligently teaming with the infrastructure, network, database, application and
business intelligence teams to guarantee high data quality and availability
- Collaboration with application teams to install operating-system and Hadoop updates, patches, and version upgrades when required.
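The new-user onboarding bullet above (Linux users, Kerberos principals, HDFS access tests) can be sketched as a command sequence. This is a hedged illustration: the realm, user name, and exact tool invocations are assumptions that vary by distribution, so the function only assembles the commands for review (or for `subprocess.run`) rather than executing anything here.

```python
# Sketch of new-user onboarding on a Kerberized Hadoop cluster.
# All realm/path/user names are hypothetical assumptions.
def hadoop_user_setup_commands(user, realm="EXAMPLE.COM"):
    return [
        # 1. Create the Linux account on the cluster nodes
        ["useradd", "-m", user],
        # 2. Create a Kerberos principal for the user
        ["kadmin.local", "-q", f"addprinc -randkey {user}@{realm}"],
        # 3. Create and own the user's HDFS home directory
        ["hdfs", "dfs", "-mkdir", f"/user/{user}"],
        ["hdfs", "dfs", "-chown", f"{user}:{user}", f"/user/{user}"],
        # 4. Smoke-test HDFS access for the new user
        ["hdfs", "dfs", "-ls", f"/user/{user}"],
    ]

for cmd in hadoop_user_setup_commands("analyst1"):
    print(" ".join(cmd))
```

Equivalent checks for Hive, Pig, and MapReduce access would follow the same pattern of running a trivial job as the new principal.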
- Bachelor's degree in Information Technology, Computer Science, or another related field
- General operational expertise such as good troubleshooting skills,
understanding of systems capacity, bottlenecks, basics of memory, CPU, OS,
storage, and networks.
- Hadoop skills like HBase, Hive, Pig, Mahout
- Ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, and schedule and take backups.
- Good knowledge of Linux as Hadoop runs on Linux.
- Familiarity with open source configuration management and deployment tools
such as Puppet or Chef and Linux scripting.
Nice to Have
- Knowledge of Troubleshooting Core Java Applications is a plus.
- Develop complex queries, pipelines, and software programs to solve analytics and data mining problems
- Interact with data scientists, product managers, and engineers to understand business problems and technical requirements, and deliver predictive and smart data solutions
- Prototype new applications or data systems
- Lead data investigations to troubleshoot data issues that arise along the data pipelines
- Collaborate with different product owners to incorporate data science solutions
- Maintain and improve the data science platform
- BS/MS/PhD in Computer Science, Electrical Engineering, or a related discipline
- Strong fundamentals: data structures, algorithms, databases
- 5+ years of software industry experience, with 2+ years in analytics, data mining, and/or data warehousing
- Fluency with Python
- Experience developing web services using REST approaches
- Proficiency with SQL/Unix/shell
- Experience in DevOps (CI/CD, Docker, Kubernetes)
- Self-driven, challenge-loving, detail-oriented, with teamwork spirit, excellent communication skills, and the ability to multi-task and manage expectations
- Industry experience with big data processing technologies such as Spark and Kafka
- Experience with machine learning algorithms and/or R is a plus
- Experience in Java/Scala is a plus
- Experience with MPP analytics engines like Vertica
- Experience with data integration tools like Pentaho/SAP Analytics Cloud
- Design AWS data ingestion frameworks and pipelines based on the specific needs driven by the Product Owners and user stories…
- Experience building a Data Lake on AWS, and hands-on experience with S3, EKS, ECS, AWS Glue, AWS KMS, AWS Firehose, and EMR
- Experience with Apache Spark programming on Databricks
- Experience working with NoSQL databases such as Cassandra, HBase, and Elasticsearch
- Hands-on experience leveraging CI/CD to rapidly build and test application code
- Expertise in Data governance and Data Quality
- Experience working with PCI Data and working with data scientists is a plus
- 4+ years of experience with the following Big Data frameworks: file formats (Parquet, AVRO, ORC), resource management, distributed processing, and RDBMS
- 5+ years of experience designing and developing data pipelines for data ingestion or transformation using AWS technologies
We are a nascent quantitative hedge fund led by an MIT PhD and Math Olympiad medallist, offering opportunities to grow with us as we build out the team. Our fund has world class investors and big data experts as part of the GP, top-notch ML experts as advisers to the fund, plus has equity funding to grow the team, license data and scale the data processing.
We are interested in researching and taking in live a variety of quantitative strategies based on historic and live market data, alternative datasets, social media data (both audio and video) and stock fundamental data.
You would join, and, if qualified, lead a growing team of data scientists and researchers, and be responsible for a complete lifecycle of quantitative strategy implementation and trading.
- At least 3 years of relevant ML experience
- Graduation date: 2018 or earlier
- 3-5 years of experience in advanced Python programming
- Master's degree (or PhD) in a quantitative discipline such as Statistics, Mathematics, Physics, or Computer Science from a top university
- Good knowledge of applied and theoretical statistics, linear algebra and machine learning techniques.
- Ability to leverage financial and statistical insights to research, explore and harness a large collection of quantitative strategies and financial datasets in order to build strong predictive models.
- Should take ownership of the research, design, development, and implementation of strategy development, and communicate effectively with teammates
- Prior experience and good knowledge of lifecycle and pitfalls of algorithmic strategy development and modelling.
- Good practical knowledge in understanding financial statements, value investing, portfolio and risk management techniques.
- A proven ability to lead and drive innovation to solve challenges and roadblocks in project completion
- A valid GitHub profile with some activity on it
Bonus to have:
- Experience in storing and retrieving data from large and complex time series databases
- Very good practical knowledge of time-series modelling and forecasting (ARIMA, ARCH, and stochastic modelling)
- Prior experience in optimizing and back testing quantitative strategies, doing return and risk attribution, feature/factor evaluation.
- Knowledge of AWS/Cloud ecosystem is an added plus (EC2s, Lambda, EKS, Sagemaker etc.)
- Knowledge of REST APIs and data extracting and cleaning techniques
- Experience in PySpark or other big-data programming/parallel-computing frameworks is good to have
- Familiarity with derivatives, and knowledge of multiple asset classes alongside equities
- Any progress towards the CFA or FRM is a bonus
- An average tenure of at least 1.5 years per company
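As a toy illustration of the time-series modelling called out in the bonus list (ARIMA-family models), the sketch below simulates and fits an AR(1) process with plain NumPy. It is a teaching example under assumed parameters, not a trading strategy or the fund's methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series: x_t = phi * x_{t-1} + eps_t
phi_true = 0.7
n = 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Estimate phi by least squares: regress x_t on x_{t-1}
x_lag, x_now = x[:-1], x[1:]
phi_hat = (x_lag @ x_now) / (x_lag @ x_lag)

# One-step-ahead forecast from the last observation
forecast = phi_hat * x[-1]
print(phi_hat)  # close to 0.7
```

A full ARIMA or ARCH workflow adds differencing, order selection, and residual diagnostics on top of this basic regression step.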
Punchh is the leader in customer loyalty, offer management, and AI solutions for offline and omni-channel merchants including restaurants, convenience stores, and retailers. Punchh brings the power of online to physical brands by delivering omni-channel experiences and personalization across the entire customer journey--from acquisition through loyalty and growth--to drive same store sales and customer lifetime value. Punchh uses best-in-class integrations to POS and other in-store systems such as WiFi, to deliver real-time SKU-level transaction visibility and offer provisioning for physical stores.
Punchh is growing exponentially and serves 200+ brands encompassing 91K+ stores globally. Punchh’s customers include top convenience stores such as Casey’s General Stores, 25+ of the top 100 restaurant brands such as Papa John's, Little Caesars, Denny’s, Focus Brands (5 of 7 brands), and Yum! Brands (KFC, Pizza Hut, and Taco Bell), and retailers. For a multi-billion-dollar brand with 6K+ stores, Punchh drove a 3% lift in same-store sales within the first year. Punchh is powering loyalty programs for 135+ million consumers.
Punchh has raised $70 million from premier Silicon Valley investors including Sapphire Ventures and Adam Street Partners, has a seasoned leadership team with extensive experience in digital, marketing, CRM, and AI technologies as well as deep restaurant and retail industry expertise.
About the Role:
Punchh Tech India Pvt. is looking for a Senior Data Analyst – Business Insights to join our team. If you're excited to be part of a winning team, Punchh is a great place to grow your career.
This position is responsible for discovering important trends in the complex data generated on the Punchh platform that have high business impact (influencing product features and the roadmap): creating hypotheses around these trends, validating them for statistical significance, and making recommendations.
Reporting to: Director, Analytics
Job Location: Jaipur
Experience Required: 4-6 years
What You’ll Do
- Take ownership of custom data analysis projects/requests and work closely with end users (both internal and external clients) to deliver the results
- Identify successful implementation/utilization of product features and contribute to the best-practices playbook for client facing teams (Customer Success)
- Strive towards building mini business intelligence products that add value to the client base
- Represent the company’s expertise in advanced analytics in a variety of media outlets such as client interactions, conferences, blogs, and interviews.
What You’ll Need
- Master's in business/behavioral economics/statistics with a strong interest in marketing technology
- Proven track record of at least 5 years uncovering business insights, especially related to Behavioral Economics and adding value to businesses
- Proficient in using the proper statistical and econometric approaches to establish the presence and strength of trends in data. Strong statistical knowledge is mandatory.
- Extensive prior exposure to causal-inference studies, based on both longitudinal and cross-sectional data.
- Excellent experience using Python (or R) to analyze data from extremely large or complex data sets
- Exceptional data querying skills (Snowflake/Redshift, Spark, Presto/Athena, to name a few)
- Ability to effectively articulate complex ideas in simple and effective presentations to diverse groups of stakeholders.
- Experience working with a visualization tool (preferably, but not restricted to Tableau)
- Domain expertise: extensive exposure to retail business, restaurant business or worked on loyalty programs and promotion/campaign effectiveness
- Should be self-organized and be able to proactively identify problems and propose solutions
- Gels well within and across teams, working with stakeholders from various functions such as Product, Customer Success, and Implementations, among others
- As the business-side stakeholders are based in the US, should be flexible to schedule meetings convenient for West Coast timings
- Effective in working autonomously to get things done and taking the initiatives to anticipate needs of executive leadership
- Able and willing to relocate to Jaipur post pandemic.
What We Offer
- Medical Coverage, to keep you and your family healthy.
- Compensation that stacks up with other tech companies in your area.
- Paid vacation days and holidays to rest and relax.
- Healthy lunch provided daily to fuel you through your work.
- Opportunities for career growth and training support, including fun team building events.
- Flexibility and a comfortable work environment for you to feel your best.