Browsing by Author "Abbasi, Maryam"
Now showing 1 - 10 of 16
- Adaptive and Scalable Database Management with Machine Learning Integration: A PostgreSQL Case Study
  Publication. Abbasi, Maryam; Bernardo, Marco V.; Vaz, Paulo; Silva, José; Martins, Pedro
  The increasing complexity of managing modern database systems, particularly in terms of optimizing query performance for large datasets, presents significant challenges that traditional methods often fail to address. This paper proposes a comprehensive framework for integrating advanced machine learning (ML) models within the architecture of a database management system (DBMS), with a specific focus on PostgreSQL. Our approach leverages a combination of supervised and unsupervised learning techniques to predict query execution times, optimize performance, and dynamically manage workloads. Unlike existing solutions that address specific optimization tasks in isolation, our framework provides a unified platform that supports real-time model inference and automatic database configuration adjustments based on workload patterns. A key contribution of our work is the integration of ML capabilities directly into the DBMS engine, enabling seamless interaction between the ML models and the query optimization process. This integration allows for the automatic retraining of models and dynamic workload management, resulting in substantial improvements in both query response times and overall system throughput. Our evaluations using the Transaction Processing Performance Council Decision Support (TPC-DS) benchmark dataset at scale factors of 100 GB, 1 TB, and 10 TB demonstrate a reduction of up to 42% in query execution times and a 74% improvement in throughput compared with traditional approaches. Additionally, we address challenges such as potential conflicts in tuning recommendations and the performance overhead associated with ML integration, providing insights for future research directions. This study is motivated by the need for autonomous tuning mechanisms to manage large-scale, heterogeneous workloads while answering key research questions, such as the following: (1) How can machine learning models be integrated into a DBMS to improve query optimization and workload management? (2) What performance improvements can be achieved through dynamic configuration tuning based on real-time workload patterns? Our results suggest that the proposed framework significantly reduces the need for manual database administration while effectively adapting to evolving workloads, offering a robust solution for modern large-scale data environments.
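The abstract above describes predicting query execution times from workload patterns but does not specify the models. As a purely illustrative sketch of the basic idea, a least-squares fit of execution time against rows scanned can serve as a minimal predictor; the training data below is hypothetical, not from the paper:

```python
# Illustrative only: fit execution time (ms) as a linear function of rows
# scanned, using closed-form simple least squares (no external libraries).

def fit_simple_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error of y ~ slope*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

# Hypothetical training pairs: (rows scanned, observed execution time in ms).
rows = [10_000, 50_000, 100_000, 500_000, 1_000_000]
times = [12.0, 55.0, 104.0, 510.0, 1015.0]
slope, intercept = fit_simple_ols(rows, times)

def predict_ms(rows_scanned):
    """Predict execution time for a query scanning this many rows."""
    return slope * rows_scanned + intercept
```

A real DBMS-integrated model would of course use many plan and workload features; this sketch only shows the supervised-prediction step in isolation.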
- An Overview on Cloud Services for Human Tracking
  Publication. Martins, Manuel; Mota, David; Martins, Pedro; Abbasi, Maryam; Caldeira, Filipe
  This paper tests the use of public cloud services to assess the presence of humans in a given space, more precisely in multiple stores, with minimal effort and as quickly as possible. It also aims to demonstrate that the public cloud can add value in both business and research areas. In this specific case, the cloud was used to train and serve artificial intelligence models.
- An overview on how to develop a low-code application using OutSystems
  Publication. Martins, Ricardo; Caldeira, Filipe; Sá, Filipe; Abbasi, Maryam; Martins, Pedro
  The motivation for developing a self-service platform for employees arises from the idea that every organization has tasks that could be automated in order to redirect work resources to more important ones. The proposed application is a self-service platform for personal information and scheduling tasks, aimed at employees, in contrast to the solutions on the market that target Human Resources departments. We focus on the employees, giving them responsibility for their own personal management, such as updating their personal information and booking their vacations, while Human Resources is left with the task of managing the actions employees submit. At the end of the work, the final solution is expected to stand as an example of success in business automation and innovation, using the OutSystems low-code platform for the full development of the proposed application.
- Cisco NFV on Red Hat OpenStack Platform
  Publication. Oliveira, Luis; Martins, Pedro; Abbasi, Maryam; Caldeira, Filipe
  Traditional telecom networks have been facing constant challenges to keep up with bandwidth growth, latency, data consumption, and coverage. On top of that, there are also new use cases for telecom infrastructure, such as the exponential growth of the IoT. Network Function Virtualization (NFV) appears as the solution for the transition from high-cost dedicated hardware to low-cost commercial off-the-shelf (COTS) servers. This transition will not only meet the requirements of the new telecom reality but also reduce the overall operational cost of the network. This document illustrates the implementation of Cisco Virtual Network Functions (VNFs) of a virtual Evolved Packet Core (vEPC) on top of the Red Hat OpenStack Platform.
- Comprehensive Evaluation of Deepfake Detection Models: Accuracy, Generalization, and Resilience to Adversarial Attacks
  Publication. Abbasi, Maryam; Vaz, Paulo; Silva, José; Martins, Pedro
  The rise of deepfakes, synthetic media generated using artificial intelligence, threatens digital content authenticity, facilitating misinformation and manipulation. Deepfakes can depict real or entirely fictitious individuals, leveraging state-of-the-art techniques such as generative adversarial networks (GANs) and emerging diffusion-based models. Existing detection methods face challenges with generalization across datasets and vulnerability to adversarial attacks. This study focuses on subsets of frames extracted from the DeepFake Detection Challenge (DFDC) and FaceForensics++ videos to evaluate three convolutional neural network architectures (XCeption, ResNet, and VGG16) for deepfake detection. Performance metrics include accuracy, precision, F1-score, AUC-ROC, and Matthews Correlation Coefficient (MCC), combined with an assessment of resilience to adversarial perturbations via the Fast Gradient Sign Method (FGSM). Among the tested models, XCeption achieves the highest accuracy (89.2% on DFDC), strong generalization, and real-time suitability, while VGG16 excels in precision and ResNet provides faster inference. However, all models exhibit reduced performance under adversarial conditions, underscoring the need for enhanced resilience. These findings indicate that robust detection systems must consider advanced generative approaches, adversarial defenses, and cross-dataset adaptation to effectively counter evolving deepfake threats.
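The FGSM evaluation mentioned above perturbs each input in the direction of the sign of the loss gradient. A minimal sketch on a toy logistic "detector" shows the mechanism; the weights and inputs below are invented for illustration (the paper applies FGSM to CNN inputs):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm_perturb(x, y, w, b, eps):
    """FGSM step: x_adv = x + eps * sign(dL/dx). For logistic cross-entropy
    loss, dL/dx = (sigmoid(w.x + b) - y) * w, so the sign is per-component."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    g = sigmoid(z) - y  # scalar factor of the input gradient
    return [xi + eps * (1.0 if g * wi > 0 else -1.0 if g * wi < 0 else 0.0)
            for xi, wi in zip(x, w)]

# Toy "fake" sample (label y = 1) and detector weights, purely illustrative.
w, b = [2.0, -1.0], 0.0
x = [1.0, -1.0]
clean_score = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
x_adv = fgsm_perturb(x, 1.0, w, b, eps=0.3)
adv_score = sigmoid(sum(wi * xi for wi, xi in zip(w, x_adv)) + b)
```

The perturbation lowers the detector's confidence on the fake sample (adv_score < clean_score), mirroring in miniature the accuracy drop the study reports under adversarial conditions.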
- Data Privacy and Ethical Considerations in Database Management
  Publication. Pina, Eduardo; Ramos, José; Jorge, Henrique; Vaz, Paulo; Silva, José; Wanzeller, Cristina; Abbasi, Maryam; Martins, Pedro
  Data privacy and ethical considerations ensure the security of databases by respecting individual rights while upholding ethical standards when collecting, managing, and using information. Nowadays, despite regulations that help protect citizens and organizations, thousands of instances of data breaches, unauthorized access, and misuse of data affecting such individuals and organizations continue to occur. In this paper, we propose ethical considerations and best practices associated with critical data and the role of the database administrator in protecting it. First, we suggest best practices for database administrators regarding data minimization, anonymization, pseudonymization and encryption, access controls, data retention guidelines, and stakeholder communication. Then, we present a case study that illustrates the application of these ethical implementations and best practices in a real-world scenario, showing the approach in action and the benefits of privacy. Finally, the study highlights the importance of a comprehensive approach to data protection challenges and provides valuable insights for future research and developments in this field.
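Among the practices the abstract lists, pseudonymization can be sketched with a keyed hash: the same identifier always maps to the same token (so joins across tables still work), but reversing the mapping requires the secret key. A minimal stdlib example, with a hypothetical key and identifier:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Deterministic keyed hash (HMAC-SHA256): stable tokens for joins,
    computationally irreversible without the key."""
    return hmac.new(secret_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

key = b"example-secret-key"  # in practice, held in a secrets manager, not code
token = pseudonymize("patient-12345", key)
```

Unlike plain hashing, the keyed construction also resists dictionary attacks on low-entropy identifiers, since an attacker without the key cannot precompute the token table.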
- Distributed data warehouse resource monitoring
  Publication. Martins, Pedro; Sá, Filipe; Caldeira, Filipe; Abbasi, Maryam
  In this paper, we investigate the problem of providing scalability (out and in) to the Extraction, Transformation, Load (ETL) and Querying (Q) (ETL+Q) process of data warehouses. In general, data loading, transformation, and integration are heavy tasks that are performed only periodically, rather than row by row. Parallel architectures and mechanisms can optimize the ETL process by speeding up each part of the pipeline as more performance is needed. We propose parallelization solutions for each part of the ETL+Q, integrated into a framework that enables the automatic scalability and freshness of any data warehouse and ETL+Q process. Our results show that the proposed system algorithms can handle scalability to provide the desired processing speed in both big-data and small-data scenarios.
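Parallelizing the transformation stage in the way the abstract describes typically starts by sharding extracted rows across transform workers. A minimal, hypothetical sketch of hash partitioning (the framework's actual partitioning scheme is not specified here):

```python
import zlib

def partition(rows, n_workers, key):
    """Hash-partition rows so each transform worker receives a disjoint
    shard; scaling out amounts to raising n_workers and re-partitioning."""
    shards = [[] for _ in range(n_workers)]
    for row in rows:
        # crc32 is stable across processes/runs, unlike Python's built-in
        # hash() on strings, so workers agree on the row-to-shard mapping.
        h = zlib.crc32(str(row[key]).encode("utf-8"))
        shards[h % n_workers].append(row)
    return shards

# Hypothetical extracted rows awaiting transformation.
rows = [{"id": i, "amount": i * 1.5} for i in range(100)]
shards = partition(rows, n_workers=4, key="id")
```

Each shard can then be transformed independently and loaded in parallel; because the mapping is deterministic, a re-run after a worker failure reassigns exactly the same rows.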
- Enhancing Visual Perception in Immersive VR and AR Environments: AI-Driven Color and Clarity Adjustments Under Dynamic Lighting Conditions
  Publication. Abbasi, Maryam; Silva, José; Martins, Pedro; Vaz, Paulo
  The visual fidelity of virtual reality (VR) and augmented reality (AR) environments is essential for user immersion and comfort. Dynamic lighting often leads to chromatic distortions and reduced clarity, causing discomfort and disrupting user experience. This paper introduces an AI-driven chromatic adjustment system based on a modified U-Net architecture, optimized for real-time applications in VR/AR. This system adapts to dynamic lighting conditions, addressing the shortcomings of traditional methods like histogram equalization and gamma correction, which struggle with rapid lighting changes and real-time user interactions. We compared our approach with state-of-the-art color constancy algorithms, including Barron's Convolutional Color Constancy and STAR, demonstrating superior performance. Experimental results from 60 participants show significant improvements, with up to 41% better color accuracy and 39% enhanced clarity under dynamic lighting conditions. The study also included eye-tracking data, which confirmed increased user engagement with AI-enhanced images. Our system provides a practical solution for developers aiming to improve image quality, reduce visual discomfort, and enhance overall user satisfaction in immersive environments. Future work will focus on extending the model's capability to handle more complex lighting scenarios.
- Head-to-Head Evaluation of FDM and SLA in Additive Manufacturing: Performance, Cost, and Environmental Perspectives
  Publication. Abbasi, Maryam; Vaz, Paulo; Martins, Pedro; Silva, José
  This paper conducts a comprehensive experimental comparison of two widely used additive manufacturing (AM) processes, Fused Deposition Modeling (FDM) and Stereolithography (SLA), under standardized conditions using the same test geometries and protocols. FDM parts were printed with both Polylactic Acid (PLA) and Acrylonitrile Butadiene Styrene (ABS) filaments, while SLA used a general-purpose photopolymer resin. Quantitative evaluations included surface roughness, dimensional accuracy, tensile properties, production cost, and energy consumption. Additionally, environmental considerations and process reliability were assessed by examining waste streams, recyclability, and failure rates. The results indicate that SLA achieves superior surface quality (Ra ≈ 2 µm vs. 12–13 µm) and dimensional tolerances (±0.05 mm vs. ±0.15–0.20 mm), along with higher tensile strength (up to 70 MPa). However, FDM provides notable advantages in cost (approximately 60% lower on a per-part basis), production speed, and energy efficiency. Moreover, from an environmental perspective, FDM is more favorable when using biodegradable PLA or recyclable ABS, whereas SLA resin waste is hazardous. Overall, the study highlights that no single process is universally superior. FDM offers a rapid, cost-effective solution for prototyping, while SLA excels in precision and surface finish. By presenting a detailed, data-driven comparison, this work guides engineers, product designers, and researchers in choosing the most suitable AM technology for their specific needs.
- Improving bluetooth beacon-based indoor location and fingerprinting
  Publication. Martins, Pedro; Abbasi, Maryam; Sá, Filipe; Cecílio, José; Morgado, Francisco; Caldeira, Filipe
  The complex way radio waves propagate indoors leads to the derivation of location using fingerprinting techniques. In these cases, location is computed relying on Wi-Fi signal-strength mapping. Recent Bluetooth Low Energy (BLE) provides new opportunities to explore positioning. In this work, we study how BLE beacon radio signals can be used for indoor location scenarios, as well as their precision. Additionally, this paper introduces a method for beacon-based positioning, based on signal-strength measurements at key distances for each beacon. This method allows the use of different beacon types, brands, and location conditions/constraints. Depending on each situation (i.e., hardware and location), it is possible to adapt the distance-measuring curve to minimize errors and support greater distances while keeping good precision. Moreover, this paper also presents a comparison with a traditional positioning method, using formulas for distance estimation and position triangulation. The proposed study is performed inside the campus of Viseu Polytechnic Institute and tested with a group of students, each with their smartphone, as proof of concept. Experimental results show that BLE allows an error of < 1.5 m approximately 90% of the time, and the results using the proposed location detection method show that the proposed positioning technique has 13.2% better precision than triangulation for distances up to 10 m.
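The traditional distance-estimation formula the abstract compares against is usually the log-distance path-loss model, which converts an RSSI reading into a distance estimate. A minimal sketch follows; the reference values are typical for BLE beacons, not measurements from the paper:

```python
import math

def rssi_to_distance(rssi, rssi_at_1m, path_loss_exp=2.0):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n)),
    where n is the path-loss exponent (2.0 in free space, higher indoors)."""
    return 10.0 ** ((rssi_at_1m - rssi) / (10.0 * path_loss_exp))

def fit_path_loss_exp(rssi_at_1m, rssi_at_d, d):
    """Calibrate n from one extra measurement taken at a known distance d,
    in the spirit of the per-beacon key-distance calibration described above."""
    return (rssi_at_1m - rssi_at_d) / (10.0 * math.log10(d))

# Illustrative BLE values: -59 dBm measured at 1 m, -79 dBm at 10 m.
n = fit_path_loss_exp(-59.0, -79.0, 10.0)  # -> 2.0, the free-space exponent
```

Because n varies with beacon hardware and the surrounding environment, calibrating it per beacon and per site (as the paper's key-distance method does) is what keeps the distance curve accurate across different beacon brands.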