UASoftDev
At UASoftDev, we believe software development goes beyond writing code—it’s about transforming big ideas into impactful digital products that deliver real results. We partner with clients worldwide, offering full-cycle development, outsourcing, and outstaffing solutions customized to your needs.
Mission & Approach
Our mission is to bridge the gap between innovation and success. We provide flexible engagement models—whether you need a single expert or a complete team—to align with your goals. Through Agile methodologies, we adapt swiftly to changing requirements, ensuring transparency at every step.
SAP BTP Fullstack Developer with Python and UI5 knowledge
Full Remote · Countries of Europe or Ukraine · 5 years of experience · Upper-Intermediate

Summary of Key Skills for the Role:
- Core: JavaScript, UI5/Fiori development, Python, SAP BTP, BAS, Git, Deployment.
- Nice to Have: ABAP/CDS, Data development, Debugging skills.
Developer Resource Management Team
- JavaScript:
- Proficient in JavaScript (ES6+), with a strong understanding of its core concepts (such as closures, promises, async/await, and DOM manipulation).
- Familiarity with JavaScript frameworks and libraries (React, Angular, etc.) is a plus, but core JavaScript expertise is essential.
- UI5/Fiori App Development:
- Expertise in SAP UI5 (OpenUI5) for developing custom user interfaces and Fiori applications.
- Experience with Fiori design principles, components, and patterns.
- Familiarity with SAP Fiori Launchpad and SAPUI5 controls.
- Python:
- Strong proficiency in Python, including familiarity with libraries and frameworks (Django, Flask, Pandas, NumPy, etc.).
- Knowledge of integrating Python-based services with other systems and databases.
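To illustrate the kind of integration work this bullet describes, here is a minimal, dependency-free sketch using only the standard library; the `orders` table and its columns are illustrative assumptions, not part of the actual project:

```python
import json
import sqlite3

def fetch_orders_as_json(db_path: str = ":memory:") -> str:
    """Query a relational store and serialize rows for a downstream service.

    The 'orders' table and its columns are hypothetical examples.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.99), (2, 25.50)])
    rows = conn.execute("SELECT id, total FROM orders ORDER BY id").fetchall()
    conn.close()
    # Shape the result the way a JSON API consumer would expect it.
    return json.dumps([{"id": r[0], "total": r[1]} for r in rows])
```

In a real project the same pattern would target a production database driver and a framework such as Flask or Django rather than in-memory SQLite.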
- Development Tools:
- BAS (Business Application Studio): Familiar with SAP’s Business Application Studio for building and deploying SAP Fiori/UI5 applications.
- Git: Proficient in Git for version control, including branching, merging, pull requests, and code reviews.
- SAP BTP (Business Technology Platform):
- Experience working with SAP BTP services such as SAP Integration Suite, SAP HANA Cloud, SAP AI services, etc.
- Knowledge of deploying and managing applications in the BTP environment, including security, authentication, and configuration.
- Application Deployment:
- Hands-on experience with deploying and maintaining applications in cloud environments.
- Familiarity with CI/CD pipelines for automating deployment processes.
- BTP Services:
- Understanding of SAP Cloud services, including SAP HANA, SAP Integration Suite, SAP Fiori, etc.
- Experience working with SAP BTP offerings and ensuring the proper configuration and performance of deployed applications.
- Communication Skills:
- Strong verbal and written communication skills, with the ability to explain technical concepts clearly to non-technical stakeholders and collaborate effectively with cross-functional teams.
Nice to Have:
- ABAP (Advanced Business Application Programming):
- Experience with ABAP, the SAP programming language, especially in the context of SAP development and customizations.
- CDS (Core Data Services):
- Understanding of Core Data Services (CDS) in SAP, which is used for defining semantically rich data models in the SAP HANA database.
- Data Development or Debugging Skills:
- Experience with database development or troubleshooting/debugging complex data models.
- Familiarity with SQL, debugging techniques, and optimizing queries in SAP environments.
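As a small illustration of the query-optimization skill mentioned above, the sketch below uses SQLite's `EXPLAIN QUERY PLAN` to show how adding an index changes a full table scan into an index search; table and column names are illustrative assumptions:

```python
import sqlite3

def index_speeds_lookup() -> tuple:
    """Compare query plans before and after adding an index (illustrative)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    query = "EXPLAIN QUERY PLAN SELECT * FROM sales WHERE region = 'EU'"
    # Without an index, the plan detail reports a full scan of the table.
    plan_before = conn.execute(query).fetchall()[0][3]
    conn.execute("CREATE INDEX idx_region ON sales(region)")
    # With the index, the planner switches to an index search.
    plan_after = conn.execute(query).fetchall()[0][3]
    conn.close()
    return plan_before, plan_after
```

The same technique (reading the optimizer's plan before and after a schema change) carries over to SAP HANA and other databases, though the plan output format differs.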
Data Engineer with KNIME experience
Full Remote · Countries of Europe or Ukraine · 5 years of experience · Upper-Intermediate

We are looking for a skilled Data Engineer with hands-on experience in KNIME to join our data team. You will be responsible for designing and maintaining scalable data pipelines and ETL workflows, integrating multiple data sources, and ensuring high data quality and availability across the organization.
Key Responsibilities
- Design, develop, and manage ETL workflows using KNIME Analytics Platform
- Integrate and transform data from various sources (databases, APIs, flat files, etc.)
- Optimize data pipelines for performance and reliability
- Collaborate with data analysts, scientists, and business stakeholders
- Maintain clear documentation of data processes and pipelines
- Monitor and troubleshoot data quality issues
Requirements
- 2+ years of experience as a Data Engineer or similar role
- Solid hands-on experience with KNIME (workflow creation, integration, automation)
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL, MSSQL)
- Familiarity with Python or R for data manipulation (a plus)
- Understanding of ETL concepts and data warehousing principles
- Experience working with APIs, JSON, XML, Excel, and CSV data
- Good communication skills and ability to work in cross-functional teams
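The extract-transform-load cycle named in the requirements can be sketched compactly; KNIME itself is a visual workflow tool, so this is only the equivalent logic in plain Python, with illustrative column names (`name`, `amount`):

```python
import csv
import io
import sqlite3

def run_etl(raw_csv: str) -> list:
    """Toy ETL: extract CSV rows, normalize them, load into SQLite."""
    # Extract: parse the raw CSV text into dict rows.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: strip whitespace, cast amounts, drop records with no name.
    cleaned = [
        (r["name"].strip(), float(r["amount"]))
        for r in rows
        if r["name"].strip()
    ]
    # Load: write into a relational target table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE facts (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO facts VALUES (?, ?)", cleaned)
    return conn.execute("SELECT * FROM facts ORDER BY name").fetchall()
```

A KNIME workflow would express the same three stages as a CSV Reader node, transformation nodes, and a DB Writer node.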
Nice to Have
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of other ETL tools (e.g., Talend, Alteryx, Apache NiFi)
- Basic knowledge of machine learning or business intelligence tools
What We Offer
- Competitive salary and performance bonuses
- Flexible working hours and remote work options
- Opportunities for professional growth and training
- Collaborative and supportive team culture
Senior Python Developer with LLM experience
Full Remote · Countries of Europe or Ukraine · 5 years of experience · Upper-Intermediate

What We’re Looking For
- Strong proficiency in Python (5+ years experience)
- Practical experience with LLM frameworks/tools, including:
- Vector databases (especially Weaviate)
- LangChain, Crew-AI, or LlamaIndex
- Ability to work independently and integrate with existing dev teams
- Solid understanding of software architecture and modular code design
Nice to Have
- Experience deploying LLM-powered applications in production
- Familiarity with API development, containerization (Docker), or cloud platforms
- Understanding of prompt engineering and retrieval-augmented generation (RAG) concepts
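The "R" in RAG, retrieving the documents most relevant to a query before generation, can be illustrated without any of the tools above. A production system would embed text and search a vector database such as Weaviate; bag-of-words cosine similarity stands in here so the sketch stays dependency-free, and all document strings are invented examples:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k documents most similar to the query (the retrieval step of RAG)."""
    q = Counter(query.lower().split())
    ranked = sorted(
        docs,
        key=lambda d: cosine(q, Counter(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]
```

In a real pipeline the retrieved passages would then be injected into the LLM prompt, which is where frameworks like LangChain or LlamaIndex come in.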
Senior AWS DevOps with Python coding skills
Full Remote · Countries of Europe or Ukraine · 5 years of experience · Advanced/Fluent

Please note that this position requires commercial experience with Python coding. You should know how to write small applications in Python; writing scripts alone is not enough for this position.
A DevOps engineer in this role should also have a developer's mindset, as reading code will be part of the daily routine.
Please don't apply if you don't have commercial Python development experience; it is a mandatory requirement.
About the Role:
We’re looking for a skilled DevOps/Infrastructure Engineer with strong experience in AWS, CI/CD, Big Data technologies, and infrastructure as code. You’ll work in a cross-functional scrum team, responsible for both software development and operational tasks. You’ll be expected to proactively contribute to system design, monitor operations, and collaborate closely with developers and stakeholders.
Requirements:
Core Skills:
- Proficient with AWS services: S3, EBS, EC2, Lambda, EMR, Redshift, RDS, DynamoDB
- Programming/Scripting languages: Python, JavaScript, R, Matlab
- CI/CD pipelines, including unit and integration testing
- Experience with Big Data technologies (EMR, Spark, Redshift)
- Infrastructure as Code using Terraform
- Familiar with microservice and serverless architectures
- Solid understanding of containerized environments (Docker, Kubernetes)
- Confident in terminal work, including operations on remote servers
Infra & DevOps Stack:
- Comfortable with multiple AWS accounts and AWS CLI
- Jenkins, GitHub Actions & Action Runners
- Helm, Calico, K8s
- Monitoring & Logging: Grafana, Prometheus, Sentry
- Security & Scanning: Sops, Sonarqube, Trivy
- Automation & Tools: Renovate, Devbox, Makefiles, Shell scripting
- HashiCorp Packer
Key AWS Services in Use:
- EC2, VPC, Load Balancers
- S3, S3 Access Points
- Systems Manager Parameter Store
- Lambda, Step Functions
- SNS, SQS, CloudWatch
- DynamoDB, Redshift
- ECS, ECR, EKS
- KMS, ACM, WAF, IAM
Development Stack:
- Python and Serverless framework
- TypeScript, Vue.js
- Experience with: Lambda, Step Functions, SQS, SNS, API Gateway, EventBridge, Parameter Store
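As a taste of the development side of this stack, here is a minimal Python Lambda handler for an SQS-triggered event. The `Records`/`body` structure is the standard SQS event shape that Lambda delivers; the payload field (`order_id`) is an illustrative assumption:

```python
import json

def handler(event: dict, context=None) -> dict:
    """Minimal AWS Lambda handler for SQS-triggered events.

    'Records' and 'body' follow the standard SQS event format;
    the 'order_id' payload field is a hypothetical example.
    """
    processed = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # Real code would do the business work here (DynamoDB write, SNS publish, etc.).
        processed.append(payload["order_id"])
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

Because a handler is just a function, it can be unit-tested locally by passing a sample event dict, which is also how CI pipelines in this kind of setup typically exercise Lambda code before deployment.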
Responsibilities:
Software Development:
- Develop and maintain services using modern programming languages
- Break down improvement ideas into actionable tasks
- Participate in agile processes within the scrum team
Operations:
- Participate in on-call rotation for service operations
- Monitor system alerts and respond accordingly
- Create and handle problem reports based on incident detection
Communication & Collaboration:
- Write clear, structured documentation for users and admins
- Work closely with both internal and external team members
- Support users and answer feature-related questions