

Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a minimal Airflow sketch follows this list).
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
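For illustration, here is a minimal sketch of the kind of workflow the orchestration tools above manage, written as an Airflow DAG in Python. The DAG name, schedule, and task bodies are hypothetical placeholders for this example, not details from the posting.

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load workflow.
# All names here are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a hypothetical source system.
    print("extracting source data")


def transform():
    # Clean and reshape the extracted records.
    print("transforming records")


def load():
    # Write the transformed data to a warehouse (e.g., Redshift).
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_etl",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract, then transform, then load.
    extract_task >> transform_task >> load_task
```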

About Mobile Programming LLC
Mobile Programming LLC is a US-based multinational company. Our services focus on Mobility Solutions, Custom App Development, Cloud Solutions, IoT, AR/VR, Blockchain, Artificial Intelligence & Machine Learning, Predictive Analytics & Big Data Solutions, and several other trending and emerging technologies.
You can find more details at https://www.mobileprogramming.com/

Candidate must have experience with React, Angular, MobX, and SCSS.
Candidate should have hands-on experience working with an RDBMS.
Candidate should have experience developing applications from scratch.
Candidate should be particularly strong in Angular.
Candidates from service-based companies will be preferred.
Job Overview:
We are seeking an experienced DevOps Engineer to join our team. The successful candidate will be responsible for designing, implementing, and maintaining the infrastructure and software systems required to support our development and production environments. The ideal candidate should have a strong background in Linux, GitHub, GitHub Actions/Jenkins, ArgoCD, AWS, Kubernetes, Helm, Datadog, MongoDB, Envoy Proxy, Cert-Manager, Terraform, ELK, Cloudflare, and BigRock.
Responsibilities:
• Design, implement, and maintain CI/CD pipelines using GitHub, GitHub Actions/Jenkins, Kubernetes, Helm, and ArgoCD.
• Deploy and manage Kubernetes clusters using AWS.
• Configure and maintain Envoy Proxy and Cert-Manager to automate deployment and manage application environments.
• Monitor system performance using Datadog, ELK, and Cloudflare tools.
• Automate infrastructure management and maintenance tasks using Terraform, Ansible, or similar tools (a minimal sketch follows this list).
• Collaborate with development teams to design, implement and test infrastructure changes.
• Troubleshoot and resolve infrastructure issues as they arise.
• Participate in on-call rotation and provide support for production issues.
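As a concrete illustration of the kind of maintenance automation described above, here is a minimal sketch in Python using boto3 (the AWS SDK). The region and the "owner" tag policy are assumptions made for the example, not requirements from the posting.

```python
# Sketch: find EC2 instances missing a required tag, a typical
# infrastructure-hygiene task. Region and tag key are assumed values.
import boto3

REQUIRED_TAG = "owner"  # hypothetical tagging policy

ec2 = boto3.client("ec2", region_name="us-east-1")

# Page through all instances in the region.
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if REQUIRED_TAG not in tags:
                # In a real pipeline this might raise a Datadog alert
                # or open a ticket instead of printing.
                print(f"untagged instance: {instance['InstanceId']}")
```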
Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering or a related field.
• 4+ years of experience in DevOps engineering with a focus on Linux, GitHub, GitHub Actions/CodeFresh, ArgoCD, AWS, Kubernetes, Helm, Datadog, MongoDB, Envoy Proxy, Cert-Manager, Terraform, ELK, Cloudflare, and BigRock.
• Strong understanding of Linux administration and shell scripting.
• Experience with automation tools such as Terraform, Ansible, or similar.
• Ability to write infrastructure as code using tools such as Terraform, Ansible, or similar.
• Experience with container orchestration platforms such as Kubernetes.
• Familiarity with container technologies such as Docker.
• Experience with cloud providers such as AWS.
• Experience with monitoring tools such as Datadog and ELK.
Skills:
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration skills.
• Ability to work independently or in a team environment.
• Strong attention to detail.
• Ability to learn and apply new technologies quickly.
• Ability to work in a fast-paced and dynamic environment.
• Strong understanding of DevOps principles and methodologies.
Kindly apply at https://www.wohlig.com/careers
- Design, develop, and maintain Java-based applications using Spring Boot and Hibernate frameworks.
- Collaborate with cross-functional teams to gather and analyze requirements, and translate them into technical solutions.
- Implement microservices architecture to build scalable and resilient systems.
- Write clean, efficient, and maintainable code adhering to best practices and coding standards.
- Conduct code reviews and provide constructive feedback to team members.
- Troubleshoot and debug issues to ensure optimal performance and reliability.
- Stay updated with the latest trends and technologies in Java development, microservices, and cloud computing.
SEO, Digital Marketing, Display Advertising, SEM, Google Analytics, Google AdWords, Online Marketing, SMO, Webtrends, Digital Campaigns, Social Media, Social Media Manager, Facebook Marketing, Facebook Sales, Graphic Designing, Email Marketing
Job Description:
Fluent in English
Proven track record
Strategic digital acumen & ability to execute flawlessly
Development & maintenance of digital channels (website, email, social media), including online content distribution & social listening
Planning digital media campaigns, including web SEO/SEM, email, social media & display advertising
Guard the brand guidelines, identity & usage in the social media space across various mediums
Develop consumer marketing strategies to successfully establish our brand(s): digital awareness, growth & loyalty
Responsible for consumer insights for creating best-in-class digital campaigns
Execute all facets of digital marketing strategy, from paid media to social engagement to launch activations
Content marketing strategies: must have experience with storytelling campaigns
Drive organic acquisition of consumers via SEO and SEM programs
Drive digital audience understanding & profiling exercises
Evaluate, develop, manage, test & implement e-commerce initiatives
Skills & Competencies:
Minimum 3 years of experience in Digital Marketing.
Analytical & data-oriented marketing focus to drive results
Hands-on digital expertise in SEO, SEM, social & organic marketing programs
Exposure to & understanding of the content business will be an advantage


Skills required:
- Expert in writing clean, test-driven, easily maintainable, and modular code.
- Having a good working knowledge of database design and maintenance.
- Able to build reusable code and libraries for future use.
- Knowing troubleshooting and debugging applications is essential.
- Should be able to run UI and backend performance tests and optimize accordingly.
- Capable of designing and developing web applications for the PHP platform (Laravel)
- Preferred to collaborate with cross-functional teams to define, design, and ship new features.
- Able to ensure the performance, quality, and responsiveness of web applications.
- Having sound working knowledge of identifying and correcting bottlenecks and fixing bugs.
- Able to work on bug fixing and improving application performance.
- Able to constantly discover, evaluate, and implement new technologies to maximize development efficiency.
Experience: 8-10 years
Location: NCR
Roles and Responsibilities:
System Analyst
The individual in this role will gather, document, and analyze client functional, transactional, and insurance business requirements across all insurance functions and third-party integrations. The System Analyst will also work within a cross-functional project team to provide business analytical support and leadership from the business side. The individual will play a highly visible, client-facing, and consultative role, offering system solutions to enhance client implementations and transform client workflows and business processes. The individual should be adept at mapping business functions and attributes to insurance rules and data.
Skills Required:
- A successful candidate in this role must have:
- Good hands-on skills in Oracle, PL/SQL, and T-SQL
- Strong object-oriented knowledge
- Functional knowledge of P&C (Property & Casualty) insurance
- An overall split of roughly 60% technical and 40% functional work
Primary Skills:
- Good data mapping knowledge; RDBMS/SQL knowledge
Secondary Skills:
- Oracle is a big plus



About LodgIQ
LodgIQ is led by a team of experienced hospitality technology experts, data scientists and product domain experts. Seed funded by Highgate Ventures, a venture capital platform focused on early stage technology investments in the hospitality industry and Trilantic Capital Partners, a global private equity firm, LodgIQ has made a significant investment in advanced machine learning platforms and data science.
Title: Data Scientist
Job Description:
- Apply Data Science and Machine Learning to a REAL-LIFE problem - “Predict Guest Arrivals and Determine Best Prices for Hotels”
- Apply advanced analytics in a BIG Data Environment – AWS, MongoDB, SKLearn
- Help scale up the product in a global offering across 100+ global markets
Qualifications:
- Minimum 3 years of experience with advanced data analytic techniques, including data mining, machine learning, statistical analysis, and optimization. Student projects are acceptable.
- At least 1 year of experience with Python / NumPy / Pandas / SciPy / Matplotlib / scikit-learn (a minimal sketch follows this list)
- Experience working with massive data sets, both structured and unstructured, with at least one prior engagement involving data gathering, data cleaning, data mining, and data visualization
- Solid grasp of optimization techniques
- Master's or PhD degree in Business Analytics, Data Science, Statistics, or Mathematics
- Ability to show a track record of solving large, complex problems
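To make the modeling task concrete, below is a minimal, self-contained sketch using the scikit-learn stack named above. The features, target, and model choice are fabricated placeholders for illustration only; the actual LodgIQ pipeline is not described in this posting.

```python
# Sketch: a demand-prediction regression of the kind described above,
# trained on clearly synthetic, fabricated data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Fabricated stand-in features: day of week, booking lead time, room price.
X = np.column_stack([
    rng.integers(0, 7, 1000),      # day of week
    rng.integers(0, 90, 1000),     # booking lead time (days)
    rng.uniform(80, 400, 1000),    # room price
])
# Fabricated target: nightly guest arrivals, with weekend uplift and noise.
y = 50 - 0.05 * X[:, 2] + 3 * (X[:, 0] >= 5) + rng.normal(0, 2, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```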
- Design and develop highly scalable, reliable, and fault-tolerant systems for one of India's fastest-growing product startups
- Translate business requirements into scalable and extensible design.
- Pair with team members on functional and non-functional requirements and spread design philosophy and goals across the team.
- Partner with the product management team to define and execute the feature roadmap.
- Coordinate with cross-functional teams (Mobile, DevOps, Data, UX, QA, etc.) on planning and execution.
- Continuously improve code quality, product execution, and customer delight.
- Proactively manage stakeholder communication related to deliverables, risks, changes, and dependencies.
- Communicate, collaborate, and work effectively across distributed teams.
- An innate desire to deliver and a strong sense of accountability for your work.
- Willingness to learn new languages and methodologies.
- Passion for learning new things and solving challenging problems.
Requirements:
- 3+ years of experience in software development or a serious open source track record
- 2+ years of hands-on experience in designing, developing, testing and deploying large scale applications in any language or stack
- Proficiency in OOP concepts, including design patterns. Experience with functional programming is a plus.
- Data modelling experience in both Relational and NoSQL databases.
- Ability to understand and implement Continuous Integration and Continuous Delivery.
- Well versed with Lean methodologies, and Test Engineering and Automation.
- Ability to design and implement low-latency RESTful services.
- Experience in troubleshooting server performance and memory issues, GC tuning, resource leaks, etc.
- Continuously refactor applications and architectures to maintain high-quality levels.
- Ability to scope, review and refine user stories for technical completeness and to alleviate dependency risks.
- We primarily use Node.js, MongoDB, AngularJS & AWS.
**About VComply:**
- VComply is a fast-growing SaaS GRC (Governance, Risk and Compliance) platform that helps organizations manage their compliance controls and risk in an intuitive manner. VComply brings a contextual understanding of GRC to compliance officers, enabling them to achieve organizational objectives while maintaining integrity and managing risk.
- VComply was founded in 2019 and is backed by Accel. In our short existence we have already signed 25+ marquee customers across North America, Australia, and New Zealand. We provide organizations with configurable yet plug-and-play GRC applications that require zero development effort and infrastructure.
**Why VComply:**
- Opportunity to have direct impact on the growth of a fast growing & dynamic company
- Challenging work environment with end-to-end ownership of your role
- Incredible growth opportunities - we are a meritocracy, and the best will always have room for growth in the company
- Opportunity to work directly with senior leadership
- Accelerated learning environment with a high performing team surrounding you
- Culture that promotes challenging oneself & continuous improvement.
- Perks & employee-friendly policies.
Primary responsibilities:
- Architect, Design and Build high performance Search systems for personalization, optimization, and targeting
- Designing systems with Solr, Akka, Cassandra, Kafka
- Algorithmic development with a primary focus on Machine Learning
- Working with rapid and innovative development methodologies such as Kanban, Continuous Integration, and daily deployments
- Participate in design and code reviews and recommend improvements
- Unit testing with JUnit, Performance testing and tuning
- Coordination with internal and external teams
- Mentoring junior engineers
- Participate in Product roadmap and Prioritization discussions and decisions
- Evangelize the solution with Professional services and Customer Success teams
Minimum requirements:
- B.Tech/M.Tech in Computer Engineering or related fields, or MCA
- 10-14 years of software development experience
- Expert in Java, Scala, or another object-oriented language
- Proficient in SQL concepts (HiveQL or Postgres a plus)
- Additional language skills for scripting and rapid application development
Desired skills and experience:
- Working with large data sets in the petabyte range
- Familiarity with UNIX (systems skills a plus)
- Working experience with Solr, Cassandra, and Kafka (a minimal Kafka sketch follows this list)
- Experience working in a distributed environment, dealing with challenges around scaling and performance
- Working with distributed teams across multiple locations
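For illustration, here is a minimal sketch of consuming a Kafka stream in Python using the kafka-python client, the kind of pipeline component this role would design around. The topic name, broker address, and consumer group are assumed placeholder values, not details from the posting.

```python
# Sketch: consume messages from a Kafka topic. All connection
# details below are hypothetical assumptions.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "search-events",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",   # assumed local broker
    group_id="search-indexer",            # hypothetical consumer group
    auto_offset_reset="earliest",         # start from the oldest message
)

for message in consumer:
    # In a real system, each message might be indexed into Solr
    # or written to Cassandra rather than printed.
    print(message.topic, message.partition, message.offset, message.value)
```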


