11+ Statistical semantics Jobs in Bangalore (Bengaluru) | Statistical semantics Job openings in Bangalore (Bengaluru)
Apply to 11+ Statistical semantics Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Statistical semantics Job opportunities across top companies like Google, Amazon & Adobe.
- Understand the business drivers and analytical use-cases.
- Translate use cases to data models, descriptive, analytical, predictive, and engineering outcomes.
- Explore new technologies and learn new techniques to solve business problems creatively
- Think big and drive the strategy for better data quality for customers.
- Be the voice of the business within engineering, and of engineering within the business and with customers.
- Collaborate with engineering and business teams to build better data products and services.
- Deliver projects collaboratively with the team and provide timely updates to customers.
What we're looking for :
- Hands-on experience in data modeling, data visualization, and pipeline design and development
- Hands-on exposure to machine learning concepts such as supervised learning, unsupervised learning, RNNs, and DNNs.
- Prior experience working with business stakeholders, in an enterprise space is a plus
- Great communication skills. You should be able to directly communicate with senior business leaders, embed yourself with business teams, and present solutions to business stakeholders
- Experience working independently and driving projects end to end; strong analytical skills.
Primary skill set: QA Automation, Python, BDD, SQL
As Senior Data Quality Engineer you will:
- Evaluate product functionality and create test strategies and test cases to assess product quality.
- Work closely with the on-shore and the offshore team.
- Validate multiple reports against the databases by running medium-to-complex SQL queries.
- Develop a strong understanding of automation objects and integrations across various platforms and applications.
- Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
- Integrate with SCM infrastructure to establish a continuous build-and-test cycle using CI/CD tools.
- Be comfortable working in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
- Establish processes and a tool set to maintain automation scripts and generate regular test reports.
- Peer-review test scripts to provide feedback and ensure they are flawless.
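The report-validation work described above can be sketched minimally. The snippet below compares a report's per-region totals against the source database, using SQLite for illustration; the table and column names (`orders`, `region`, `amount`) are assumptions, not part of the role's actual schema.

```python
# Sketch of report-vs-database validation: run an aggregate SQL query
# against the source tables and diff the result against the report.
import sqlite3

def validate_report(conn, report_totals):
    """Compare per-region totals in a report against the source database."""
    cur = conn.execute(
        "SELECT region, SUM(amount) AS total "
        "FROM orders GROUP BY region ORDER BY region"
    )
    db_totals = {region: total for region, total in cur.fetchall()}
    # Any region whose report value differs from the database value is flagged.
    mismatches = {
        region: (report_totals.get(region), db_totals.get(region))
        for region in set(report_totals) | set(db_totals)
        if report_totals.get(region) != db_totals.get(region)
    }
    return mismatches  # empty dict means the report matches the database

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("APAC", 100.0), ("APAC", 50.0), ("EMEA", 75.0)])
print(validate_report(conn, {"APAC": 150.0, "EMEA": 75.0}))  # -> {}
```

In practice the same pattern scales to complex analytical queries: the report side comes from an exported file or BI API, and the mismatch dict feeds a test assertion in the automation suite.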
Core/Must have skills:
- Excellent understanding of and hands-on experience in ETL/DWH testing, preferably on Databricks, paired with Python experience.
- Hands-on experience with SQL (analytical functions and complex queries), along with effective use of SQL client utilities.
- Clear, crisp communication and commitment to deliverables.
- Experience with big data testing is an added advantage.
- Knowledge of Spark, Scala, Hive/Impala, and Python is an added advantage.
Good to have skills:
- Test automation using BDD/Cucumber or TestNG, combined with strong hands-on experience in Java with Selenium; working experience with WebdriverIO is especially valued
- Ability to effectively articulate technical challenges and solutions
- Work experience with qTest, Jira, and WebdriverIO
- Experience comparable to a DevOps/SRE role, providing SME-level application or platform support with responsibility for designing and automating operational procedures and best practices
- Experience writing Python and shell scripts to perform health checks and automation
- Experience with Linux System Administration (preferably Red Hat)
- Hands-on experience with multi-tenant hosting environments for middleware applications (for example: centrally managed platform or infrastructure as a service)
- Experience with implementing observability, monitoring, and alerting tools
- Excellent written and oral English communication skills. The candidate must write user-facing documentation, prepare and deliver presentations to an internal audience and effectively interact with upper management, colleagues, and customers
- Independent problem-solving skills, self-motivated, and a mindset for taking ownership
- A minimum of 5 years of infrastructure production support or DevOps experience
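The health-check scripting mentioned above can be illustrated with a short sketch. The check name, threshold, and report shape are assumptions for illustration; a real script would add checks for processes, endpoints, and log freshness, and push results into monitoring.

```python
# Minimal sketch of a health-check script: each check returns a small
# result dict, and the runner aggregates an overall pass/fail status.
import shutil

def check_disk(path="/", max_used_pct=90.0):
    """Flag the filesystem holding `path` if usage exceeds the threshold."""
    usage = shutil.disk_usage(path)
    used_pct = 100.0 * usage.used / usage.total
    return {"check": "disk", "used_pct": round(used_pct, 1),
            "ok": used_pct <= max_used_pct}

def run_health_checks(checks):
    """Run every check; overall status is OK only if all checks pass."""
    results = [check() for check in checks]
    return {"ok": all(r["ok"] for r in results), "results": results}

report = run_health_checks([check_disk])
print(report["ok"])
```

A cron job or CI stage can run the script and alert (or fail the build) whenever `report["ok"]` is false.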
Additional Technical Skills
Experience with broker-based messaging infrastructure such as Apache Kafka, IBM MQ (or similar technology like ActiveMQ, Azure Service Bus) including configuration and performance tuning
Experience with public/private cloud and containerization technologies (e.g. Kubernetes)
Experience with Agile development methodology, CI/CD and automated build pipelines
Experience with DevOps methodology (e.g. Phoenix Project)
Experience with tools such as Jira, Confluence and ServiceNow
Experience working with JSON, XML, Google Protocol Buffers, Avro, FIX
Experience with troubleshooting tools such as TCPdump and Wireshark
Experience with NoSQL databases such as MongoDB and Redis
Interest in and understanding of emerging IT trends
Experience with system architecture design
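Several of the skills above concern broker-based messaging and wire formats (Kafka, JSON, Avro, Protocol Buffers). As a small illustration, the sketch below builds and round-trips a JSON message envelope of the kind typically published to a Kafka topic; the topic and field names are assumptions, and production systems would often use Avro or Protobuf with a schema registry instead of raw JSON.

```python
# Sketch of producer-side message handling: wrap a payload in an envelope
# with routing metadata, then serialize/deserialize it for the wire.
import json
import time
import uuid

def make_envelope(topic, payload):
    """Wrap a payload with metadata before publishing to a broker."""
    return {
        "topic": topic,               # e.g. a Kafka topic name (assumed)
        "id": str(uuid.uuid4()),      # idempotency / de-duplication key
        "ts": time.time(),            # producer-side timestamp
        "payload": payload,
    }

def serialize(envelope):
    return json.dumps(envelope, separators=(",", ":")).encode("utf-8")

def deserialize(raw):
    return json.loads(raw.decode("utf-8"))

msg = make_envelope("orders.created", {"order_id": 42, "total": 99.5})
assert deserialize(serialize(msg)) == msg  # lossless round trip
```

The envelope's `id` and `ts` fields are the hooks a consumer uses for de-duplication and latency monitoring, which is where the performance-tuning experience listed above comes in.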
Magento Developers
Positions: 4
Magento Developer Responsibilities:
- Meeting with the design team to discuss the needs of the company.
- Custom Module Development in Magento 1.x and Magento 2.x
- Automated Product Import Export in Magento 1.x and Magento 2.x
- Coding of the Magento templates.
- Developing Magento modules in PHP using best practices.
- Designing themes and interfaces.
- Setting performance tasks and goals.
- Troubleshooting integration issues.
- Updating website features and security patches.
Magento Developer Requirements:
- Bachelor’s degree in computer science or related field.
- Advanced knowledge of Magento, JavaScript, HTML, PHP, CSS, and MySQL.
- Experience with complete eCommerce lifecycle development.
- Understanding of modern UI/UX trends.
- Strong attention to detail.
- Ability to project-manage and work to strict deadlines.
- Ability to work in a team environment.
1. ROLE AND RESPONSIBILITIES
1.1. Implement next generation intelligent data platform solutions that help build high performance distributed systems.
1.2. Proactively diagnose problems and envisage long term life of the product focusing on reusable, extensible components.
1.3. Ensure agile delivery processes.
1.4. Work collaboratively with stake holders including product and engineering teams.
1.5. Build best-practices in the engineering team.
2. PRIMARY SKILL REQUIRED
2.1. 2-6 years of core software product development experience.
2.2. Experience working on data-intensive projects with a variety of technology stacks, including different programming languages (Java, Python, Scala).
2.3. Experience building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources to support other teams running pipelines, jobs, reports, etc.
2.4. Experience with open-source stacks.
2.5. Experience working with RDBMS and NoSQL databases.
2.6. Knowledge of enterprise data lakes, data analytics, reporting, in-memory data handling, etc.
2.7. A core computer science academic background.
2.8. An aspiration to continue pursuing a career in the technical stream.
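The extract-transform-load infrastructure described in 2.3 can be sketched as a toy pipeline. The source format, table schema, and aggregation are illustrative assumptions; real pipelines swap in connectors, a scheduler, and a warehouse, but keep the same three stages.

```python
# Toy ETL pipeline: extract rows from CSV text, transform (type
# normalisation + aggregation), and load into a SQL table.
import csv
import io
import sqlite3

RAW_CSV = "name,amount\nalice,10\nbob,20\nalice,5\n"  # assumed source format

def extract(text):
    """Read the raw source into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Normalise types and aggregate the amount per name."""
    totals = {}
    for row in rows:
        totals[row["name"]] = totals.get(row["name"], 0) + int(row["amount"])
    return sorted(totals.items())

def load(conn, rows):
    """Write the transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS totals (name TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO totals VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
print(conn.execute("SELECT * FROM totals ORDER BY name").fetchall())
# -> [('alice', 15), ('bob', 20)]
```

Keeping extract, transform, and load as separate pure-ish functions is what makes such pipelines reusable and extensible, echoing 1.2 above.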
3. Optional Skill Required:
3.1. Understanding of Big Data technologies and Machine learning/Deep learning
3.2. Understanding of diverse set of databases like MongoDB, Cassandra, Redshift, Postgres, etc.
3.3. Understanding of Cloud Platform: AWS, Azure, GCP, etc.
3.4. Experience in BFSI domain is a plus.
4. PREFERRED SKILLS
4.1. A startup mentality: comfort with ambiguity and a willingness to test, learn, and improve rapidly.
Java Developer Responsibilities:
- Designing and implementing Java-based applications.
- Analyzing user requirements to inform application design.
- Defining application objectives and functionality.
- Aligning application design with business goals.
- Developing and testing software.
- Debugging and resolving technical problems that arise.
- Producing detailed design documentation.
- Recommending changes to existing Java infrastructure.
- Developing multimedia applications.
- Developing documentation to assist users.
- Ensuring continuous professional self-development.
Java Developer Requirements:
- Degree in Computer Science or related field.
- Experience with user interface design, database structures, and statistical analyses.
- Analytical mindset and good problem-solving skills.
- Excellent written and verbal communication.
- Good organizational skills.
- Ability to work as part of a team.
- Attention to detail.
Job description
- Design and develop large-scale business application using Java, Spring boot, Microservices Architecture
- Design and develop software application code by analyzing requirements and specification using Java and J2EE
- Creating webservices (SOAP/RESTful) and consuming webservices
- Strong fundamentals in OOP concepts, exception handling, and coding standards
- Experience in MySQL/MSSQL/Oracle
- Experience with SDLC methodologies (Agile/Waterfall)
- Good understanding of data structures and algorithms
- Basic working knowledge of Unix/Linux
- Must possess strong problem solving and troubleshooting skills
- Excellent team player with strong verbal & written communication skills.
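The webservice creation and consumption listed above can be sketched minimally. The role itself is Java (Spring Boot), so this Python standard-library version is only an illustration of the pattern; the `/ping` endpoint and its payload are assumptions.

```python
# Minimal RESTful round trip: serve a JSON endpoint, then consume it.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/ping":
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 asks the OS for a free port; the server runs on a daemon thread.
server = HTTPServer(("127.0.0.1", 0), PingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consume the webservice we just exposed.
url = f"http://127.0.0.1:{server.server_port}/ping"
with urllib.request.urlopen(url) as resp:
    reply = json.loads(resp.read())
server.shutdown()
print(reply)  # -> {'status': 'ok'}
```

In a Spring Boot microservice the handler becomes a `@RestController` method, but the contract (path, status code, JSON body) is the same.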
JD for IoT Data Engineer:
The role requires experience in core Azure technologies: IoT Hub/Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight/Databricks, and SQL Data Warehouse.
You Have:
- Minimum 2 years of software development experience
- Minimum 2 years of experience in IoT/streaming data pipelines solution development
- Bachelor's and/or Master’s degree in computer science
- Strong Consulting skills in data management including data governance, data quality, security, data integration, processing, and provisioning
- Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
- Translated complex analytical requirements into the technical design including data models, ETLs, and Dashboards / Reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large scale IOT data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile
- Experience handling telemetry data with Spark Streaming, Kafka, Flink, Scala, PySpark, and Spark SQL.
- Hands-on experience with containers and Docker
- Exposure to streaming protocols like MQTT and AMQP
- Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
- Strong knowledge of continuous integration, static code analysis, and test-driven development
- Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
- Must have excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
Roles & Responsibilities
You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW/Synapse, or SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
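The telemetry work above centres on windowed aggregation, which Azure Stream Analytics or Spark Structured Streaming would run over IoT Hub data. The sketch below shows the idea in plain Python with a tumbling (non-overlapping) window; the event shape and window size are assumptions.

```python
# Tumbling-window average over sensor telemetry: each reading falls into
# exactly one fixed-size window per device, and readings in the same
# window are averaged.
from collections import defaultdict

def tumbling_window_avg(events, window_secs=60):
    """Average readings per (device, window-start) bucket.

    events: iterable of (timestamp_secs, device_id, value).
    """
    sums = defaultdict(lambda: [0.0, 0])
    for ts, device, value in events:
        bucket = int(ts // window_secs) * window_secs  # window start time
        sums[(device, bucket)][0] += value
        sums[(device, bucket)][1] += 1
    return {k: s / n for k, (s, n) in sorted(sums.items())}

events = [(0, "dev1", 10.0), (30, "dev1", 20.0), (65, "dev1", 30.0)]
print(tumbling_window_avg(events))
# -> {('dev1', 0): 15.0, ('dev1', 60): 30.0}
```

Stream engines add what this sketch omits: event-time watermarks for late data, state checkpointing, and fault tolerance, which is where the streaming-pipeline experience above matters.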
We are looking for an automation specialist who will play a key role in Sattva's digitisation initiatives. Our rapid growth in the last year has underscored the importance of technology-driven solutions to manage business processes at scale.
Currently our tech landscape is a collection of best-of-breed SaaS solutions that need to be integrated/extended based on business needs. This role involves identifying automation opportunities and realising them through low/no-code platforms like AppSheet, Zapier, etc. It is a technical role that also involves interfacing with people across different Business Units within Sattva. It offers the opportunity to work with best-in-class SaaS solutions like Google Workspace, FreshTeams, ClickUp, and QuickBooks.
Responsibilities
● Analyse the existing landscape of SaaS solutions to identify automation gaps in key business processes
● Integrate best-of-breed SaaS solutions using APIs and Low/No-Code tools
● Build apps to extend existing SaaS solutions like FreshTeams, QuickBooks, ClickUp, etc., using available APIs and SDKs
● Configure SaaS solutions to meet the needs of a specific Business Unit or of a defined security policy
● Build Slack apps to integrate with SaaS solutions in the landscape
● Troubleshoot technical issues with the configured solutions in the landscape
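The integration work above largely amounts to mapping one SaaS tool's webhook payload into another tool's expected shape, the glue a Zapier "zap" performs. The sketch below shows that mapping for a hypothetical new-hire event; the field names are illustrative assumptions, not real FreshTeams or ClickUp schemas.

```python
# Sketch of low-code integration glue: transform an incoming HR webhook
# payload into the shape a task-management API would expect.
def map_new_hire_to_task(hr_event):
    """Turn a (hypothetical) new-hire event into an onboarding task."""
    employee = hr_event["employee"]
    return {
        "name": f"Onboard {employee['first_name']} {employee['last_name']}",
        "description": f"Start date: {employee['joining_date']}",
        "tags": ["onboarding", "hr"],  # routing labels on the task side
    }

event = {"employee": {"first_name": "Asha", "last_name": "Rao",
                      "joining_date": "2024-01-15"}}
print(map_new_hire_to_task(event)["name"])  # -> Onboard Asha Rao
```

In a real integration this function sits between two authenticated API calls (receive the webhook, POST the task); keeping the mapping as a small pure function makes it easy to test and to reuse across Business Units.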
Ideal Candidate Profile
● 1+ years of experience in integrating/extending SaaS solutions
● Solid expertise in developing automation scripts and applications using JavaScript or Python
● Strong problem-solving ability
● Excellent communication skills
● Proven ability to interface with multiple stakeholders across business verticals
We are looking for candidates who are comfortable with -
- P&L preparation
- MIS Reporting
- GST and Taxation
- Income Tax
- Vendor Management
- Reconciliation
- Journal Entries
- Invoicing
- Should be comfortable with Tally ERP and Advanced Excel
Requirements:
- B.Com or M.Com with a specialisation in Accounting




