Koo is an app built in India, for Indians, to share their thoughts and opinions on any topic in their mother tongue using audio, video, or text. Koo gives Indians a stage to voice their thoughts and enables the "Best of India" to share their thoughts with the "Rest of India". Koo provides a personalised feed by letting users select the people they want to follow; users are then shown content from those people. Some of the top minds of India are invited onto our client's platform to share their thoughts and opinions on various topics. Users can follow such personalities, keep up with what they are talking about, and share their own opinions too.
Koo boasts some of the most prominent faces in India, including Union Ministers, Chief Ministers of states, actors, actresses, celebrities, sports personalities, top bureaucrats, journalists, and editors, among others from thousands of professions.
applications, with a desire to work in growing, fast-paced startups.
2. Has worked on applications from scratch; well versed in building apps for various screen densities and sizes.
3. Up to date with upcoming trends in Android and its ecosystem with respect to design, architecture, etc.
4. Experience optimising key parameters such as network utilization, application size, and battery.
5. Experience with Kotlin, coroutines, Dagger, Retrofit, RxAndroid, Android Architecture Components, etc.
6. Should have used mock frameworks, modular code design, and popular design patterns.
We at AURO MANUFACTURING PVT. LTD. are looking for a Recruiter.
Job role -
- End-to-end recruitment.
- Sourcing profiles through job portals & consultants.
- Hiring through job portals as well as through consultants.
- Screening profiles.
- Conducting interviews.
- Negotiating salary.
Skills -
- Good interpersonal skills.
- Good communication skills.
- Moderate Excel skills.
Education - Any Graduation (full-time)
Experience – 2-3 years (as a recruiter)
Job location - F P No. 37 / F P No. 65/B/2 Umiya Chowk,
Ashwinikumar Road, Varachha, Surat 395 008, India.
Skills We Require: DevOps, AWS Admin, Terraform, Infrastructure as Code
SUMMARY:
- Implement integrations requested by customers
- Deploy updates and fixes
- Provide Level 2 technical support
- Build tools to reduce occurrences of errors and improve customer experience
- Develop software to integrate with internal back-end systems
- Perform root cause analysis for production errors
- Investigate and resolve technical issues
- Develop scripts to automate visualization
- Design procedures for system troubleshooting and maintenance
Good hands-on experience with DevOps, AWS administration, Terraform, and Infrastructure as Code
Knowledge of EC2, Lambda, S3, ELB, VPC, IAM, CloudWatch, CentOS, and server hardening
Ability to understand business requirements and translate them into technical requirements
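To give a flavour of the AWS automation this role involves, a minimal Python sketch using boto3 is shown below; the region and the specific checks are illustrative assumptions, not requirements from the posting.

```python
# Minimal sketch: inventory running EC2 instances and flag CloudWatch alarms
# currently firing. Region and filters are illustrative placeholders.
import boto3

session = boto3.Session(region_name="ap-south-1")  # assumed region
ec2 = session.client("ec2")
cloudwatch = session.client("cloudwatch")

# List running instances with their IDs and types.
reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]
for reservation in reservations:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["InstanceType"])

# Report any alarms currently in the ALARM state (CPU, disk, etc.).
alarms = cloudwatch.describe_alarms(StateValue="ALARM")["MetricAlarms"]
for alarm in alarms:
    print("ALARM:", alarm["AlarmName"], alarm["StateReason"])
```

A script like this would typically sit alongside Terraform-managed infrastructure as a quick operational check rather than replace it.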
A knack for benchmarking and optimization
8+ years of experience
Excellent skills in C/C++/Java programming in the embedded domain
Strong experience in the areas of automotive, Android embedded development, and connectivity technologies
Strong experience with Android architecture, HAL, BSP, customizing drivers in the Android HAL, and the Android boot-up sequence
Familiar with the Android build system and procedures, and able to provide ideas to improve continuous build and integration
Experience with hardware peripherals such as Ethernet, CAN, DMA, I2C, SPI, UART, hardware accelerators, AFE, LCD, backlight, and touch controllers
Experience with design/modification of Hardware Abstraction Layer (HAL) libraries for Android support of low-level device features.
Experience debugging a wide range of Linux kernel drivers and Android framework HALs, e.g. audio/video, USB, Bluetooth, WiFi
Fluent in industry-standard software development tools: HW/SW debuggers, code revision control systems (Git, Perforce), IDEs, and build tools
Exposure/experience in Automotive Embedded Software Development is a plus
Hi All,
We are hiring!!
Company: SpringML India Pvt Ltd.
Role: Lead Data Engineer
Location: Hyderabad
Website: https://springml.com/
About Company:
At SpringML, we are all about empowering the 'doers' in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today's most pressing business problems so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, get excited to solve tough problems and like seeing results, fast. Our core values include placing our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What's the opportunity?
SpringML is looking to hire a top-notch Lead Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets.
As a Lead Data Engineer, your primary role will be to design and build data pipelines. You will focus on helping client projects with data integration, data preparation, and implementing machine learning on datasets.
In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Responsibilities:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks such as Hadoop, Apache Beam, and other open-source solutions (a short sketch follows this list).
- Learn quickly: the ability to understand and rapidly comprehend new areas, functional and technical, and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
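For context on the pipeline work referenced above, a minimal sketch using the Apache Beam Python SDK is shown below; the bucket paths and the user_id field are illustrative assumptions, not details from the posting.

```python
# Minimal Apache Beam sketch: read raw JSON events, key them by user,
# and count events per user. Paths and field names are hypothetical.
import json
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda event: (event["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, count: f"{user},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_counts")
    )
```

The same pipeline can run locally for development or on a managed runner such as Google Cloud Dataflow by passing the appropriate pipeline options.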
Skills:
- B.Tech degree in computer science, mathematics, or another relevant field.
- 6+ years of experience in ETL, data warehousing, visualization, and building data pipelines.
- Strong programming skills: experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark, Kafka, etc.
- Experience with Agile implementation methodology.
Experience: 4-7 yrs
- Any scripting language: Python, Scala, shell, or Bash
- Cloud: AWS
- Databases: relational (SQL) and non-relational (NoSQL)
- CI/CD tools and version control
We are actively seeking a Senior Data Engineer experienced in building data pipelines and integrations from third-party data sources by writing custom automated ETL jobs in Python (a brief sketch follows the requirements below). The role will work in partnership with other members of the Business Analytics team to support the development and implementation of new and existing data warehouse solutions for our clients. This includes designing database import/export processes used to generate client data warehouse deliverables.
- 2+ years of experience as an ETL developer, with strong data architecture knowledge of data warehousing concepts, SQL development and optimization, and operational support models.
- Experience using Python to automate ETL/data processing jobs.
- Design and develop ETL and data processing solutions using data integration tools, Python scripts, and AWS / Azure / on-premise environments.
- Experience with, or willingness to learn, AWS Glue / AWS Data Pipeline / Azure Data Factory for data integration.
- Develop and create transformation queries, views, and stored procedures for ETL processes and process automation.
- Document data mappings, data dictionaries, processes, programs, and solutions as per established standards for data governance.
- Work with the data analytics team to assess and troubleshoot potential data quality issues at key intake points, such as validating control totals at intake and again after transformation, and transparently build lessons learned into future data quality assessments.
- Solid experience with data modeling, business logic, and RESTful APIs.
- Solid experience in the Linux environment.
- Experience with NoSQL / PostgreSQL preferred
- Experience working with databases such as MySQL, NoSQL, and Postgres, and enterprise-level connectivity experience (such as connecting over TLS and through proxies).
- Experience with NGINX and SSL.
- Performance-tune data processes and SQL queries, and recommend and implement data process optimization and query tuning techniques.
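As a rough illustration of the custom automated ETL jobs described above, a minimal Python sketch might look like the following; the API endpoint, staging table, and connection string are hypothetical placeholders, not details from the posting.

```python
# Minimal ETL sketch: extract records from a third-party API, apply a simple
# transformation, and load them into PostgreSQL. Endpoint, table name and
# DSN are placeholders.
import requests
import psycopg2

API_URL = "https://api.example.com/v1/orders"      # hypothetical source
DSN = "dbname=analytics user=etl host=localhost"   # hypothetical target

def extract():
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records):
    # Keep only the fields the warehouse needs, normalising amount to float.
    return [(r["id"], r["customer_id"], float(r["amount"])) for r in records]

def load(rows):
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO staging_orders (id, customer_id, amount) "
            "VALUES (%s, %s, %s) ON CONFLICT (id) DO NOTHING",
            rows,
        )

if __name__ == "__main__":
    load(transform(extract()))
```

In production a job like this would typically be scheduled by a tool such as Airflow or cron, with logging, retries, and the control-total validation mentioned above wrapped around each step.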