PriceLabs (https://www.chicagobusiness.com/innovators/what-if-you-could-adjust-prices-meet-demand) is cloud-based software that helps vacation and short-term rentals dynamically manage prices the way large hotels and airlines do! Our mission is to help small businesses in the travel and tourism industry by giving them access to advanced analytical systems that are often restricted to large companies.
We're looking for someone with strong analytical capabilities who wants to understand how our current architecture and algorithms work, and help us design and develop long-lasting improvements to them. Depending on the needs of the day, the role will come with a good mix of teamwork, following our best practices, introducing us to industry best practices, independent thinking, and ownership of your work.
Responsibilities:
- Design, develop and enhance our pricing algorithms to enable new capabilities.
- Process, analyze, model, and visualize findings from our market level supply and demand data.
- Build and enhance internal and customer-facing dashboards to better track the metrics and trends that help customers get more value from PriceLabs.
- Take ownership of product ideas and design discussions.
- Occasional travel to conferences to interact with prospective users and partners, and learn where the industry is headed.
Requirements:
- Bachelor's, Master's, or Ph.D. in Operations Research, Industrial Engineering, Statistics, Computer Science, or other quantitative/engineering fields.
- Strong understanding of analysis of algorithms, data structures and statistics.
- Solid programming experience, including the ability to quickly prototype an idea and test it out.
- Strong communication skills, including the ability and willingness to explain complicated algorithms and concepts in simple terms.
- Experience with relational databases and strong knowledge of SQL.
- Experience building data heavy analytical models in the travel industry.
- Experience in the vacation rental industry.
- Experience developing dynamic pricing models.
- Prior experience working in a fast-paced environment.
- Willingness to wear many hats.
Job Title: Data Scientist
Job Duties
- Data Scientist responsibilities include planning projects and building analytics models.
- You should have a strong problem-solving ability and a knack for statistical analysis.
- If you're also able to align our data products with our business goals, we'd like to meet you. Your ultimate goal will be to help improve our products and business decisions by making the most out of our data.
Responsibilities
Own end-to-end business problems and metrics, build and implement ML solutions using cutting-edge technology.
Create scalable solutions to business problems using statistical techniques, machine learning, and NLP.
Design, experiment with, and evaluate highly innovative models for predictive learning.
Work closely with software engineering teams to drive real-time model experiments, implementations, and new feature creation.
Establish scalable, efficient, and automated processes for large-scale data analysis, model development, deployment, experimentation, and evaluation.
Research and implement novel machine learning and statistical approaches.
Requirements
2-5 years of experience in data science.
In-depth understanding of modern machine learning techniques and their mathematical underpinnings.
Demonstrated ability to build PoCs for complex, ambiguous problems and scale them up.
Strong programming skills (Python, Java)
High proficiency in at least one of the following broad areas: machine learning, statistical modelling/inference, information retrieval, data mining, NLP
Experience with SQL and NoSQL databases
Strong organizational and leadership skills
Excellent communication skills
Brief:
As a BI Developer at GradRight, you’ll be working with Tableau and supporting data sources to build reports for the requirements of various business teams.
Responsibilities:
- Translate business needs to technical specifications for reports and dashboards
- Design, build and deploy BI solutions
- Maintain and support data analytics platforms (e.g., Tableau, Mixpanel, Google Analytics)
- Evaluate and improve existing BI systems
- Collaborate with teams to integrate systems
- Develop and execute database queries, conduct analysis and prepare data to be shared with respective stakeholders
- Create visualizations and reports for requested projects
- Develop and update technical documentation around reports
Requirements:
- At least 3 years of proven experience as a BI Developer
- Experience at a startup
- Background in data warehouse design (e.g. dimensional modeling) and data mining
- In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (Extract, transform, load) framework
- Working knowledge of Tableau
- Knowledge of SQL queries and MongoDB
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
- You're proficient in the latest AI/machine learning technologies
- You're proficient in GPT-3-based algorithms
- You have a passion for writing code as well as understanding and crafting the ways systems interact
- You believe in the benefits of agile processes and shipping code often
- You are pragmatic and work to coalesce requirements into reasonable solutions that provide value
Responsibilities
- Deploy well-tested, maintainable and scalable software solutions
- Take end-to-end ownership of the technology stack and product
- Collaborate with other engineers to architect scalable technical solutions
- Embrace and improve our standards and processes to reduce friction and unlock efficiency
Current Ecosystem :
ShibaSwap : https://shibaswap.com/#/
Metaverse : https://shib.io/#/
NFTs : https://opensea.io/collection/theshiboshis
Game : Shiba Eternity on iOS and Android
About Us
Railofy aims to solve one of the biggest challenges facing Indian domestic travellers: the waitlist problem in Indian Railways.
We are a small team of technologists who have made significant breakthrough progress in this field using AI, so that no waitlisted railway passenger ever has to miss a trip again.
We are backed by some of the leading VC investors in India (learn more about us at railofy.com).
About the role-
*Must be train smart* (Indian Railways)
As a Data Analyst, you will work with the database and data-science teams to manage and update the ever-changing databases related to trains, airports, buses, etc.
You will also play a key role in monitoring key KPIs.
Needless to say, you will be exposed to almost all the verticals of a tech startup and play a key role in solving real life problems.
Key Requirements
- Minimum 1 year of experience managing databases, MIS reporting or analytics (non-technical)
- Should be 'train smart': knowledge of Indian Railways and trains is required
- Hands-on with MS Excel and Google Sheets
- Access to laptop & internet to work from home
- Willing to relocate to Mumbai
- Graduation (completed) from a reputed school/college (any field)
- Good problem-solving skills and ability to think strategically
- Motivation to join an early-stage startup that goes beyond compensation
- Should always be open to new learnings and responsibilities
What we look for:
We are looking for an associate who will crunch data from various sources and distill the key points from it, help us build new pipelines and improve existing ones as requested, visualize the data where required, and find flaws in our existing algorithms.
Responsibilities:
- Work with multiple stakeholders to gather the requirements of data or analysis and take action on them.
- Write new data pipelines and maintain the existing pipelines.
- Gather data from various databases and derive the required metrics from it.
Required Skills:
- Experience with Python and libraries like Pandas and NumPy.
- Experience in SQL and understanding of NoSQL databases.
- Hands-on experience in Data engineering.
- Must have good analytical skills and knowledge of statistics.
- Understanding of Data Science concepts.
- Bachelor's degree in Computer Science or a related field.
- Problem-solving skills and ability to work under pressure.
Nice to have:
- Experience in MongoDB or any other NoSQL database.
- Experience in ElasticSearch.
- Knowledge of Tableau, Power BI or any other visualization tool.
SpringML is looking to hire a top-notch Senior Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets. In this role, your primary responsibility will be to design and build data pipelines. You will be focused on helping client projects with data integration, data prep, and implementing machine learning on datasets. You will work with some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
RESPONSIBILITIES:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks in Hadoop, Apache Beam, and other open-source solutions.
- Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
SKILLS:
- B.Tech degree in Computer Science, Mathematics, or other relevant fields.
- 4+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodologies