Lucia Tutorial
Repository providing examples of how to use Lucia for ML, with a focus on virtual environment setup.
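As a rough illustration of the virtual environment workflow this tutorial focuses on, the sketch below creates an environment and installs dependencies into it; the `.venv` path and the `requirements.txt` file name are assumptions, and no Lucia-specific API is invoked.

```python
# Minimal sketch: create a virtual environment and install dependencies into it.
# The .venv location and requirements.txt file name are assumptions.
import subprocess
import venv
from pathlib import Path

env_dir = Path(".venv")
venv.EnvBuilder(with_pip=True).create(env_dir)

# pip lives under Scripts/ on Windows and bin/ elsewhere.
pip = env_dir / ("Scripts" if (env_dir / "Scripts").exists() else "bin") / "pip"

# Install the project's dependencies; requirements.txt is assumed to exist.
subprocess.run([str(pip), "install", "-r", "requirements.txt"], check=True)
```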
Enhancing OSA Assessment with Explainable AI
Explainable Artificial Intelligence (xAI) is a rapidly growing field that focuses on making deep learning models interpretable and understandable to human decision-makers. In this study, we introduce xAAEnet, a novel xAI model applied to the assessment of Obstructive Sleep Apnea (OSA) severity. OSA is a prevalent sleep disorder that can lead to numerous medical conditions and is currently assessed using the Apnea-Hypopnea Index (AHI). However, AHI has been criticized for its inability to accurately estimate the effect of OSA events on related medical conditions.
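For context, the Apnea-Hypopnea Index is conventionally computed as the number of apneas plus hypopneas per hour of sleep; the sketch below illustrates this, using the commonly cited severity cut-offs (5, 15, and 30 events per hour) as an assumption rather than anything taken from the xAAEnet study.

```python
def apnea_hypopnea_index(n_apneas: int, n_hypopneas: int, sleep_hours: float) -> float:
    """AHI = (apneas + hypopneas) per hour of sleep."""
    return (n_apneas + n_hypopneas) / sleep_hours

def ahi_severity(ahi: float) -> str:
    """Commonly cited clinical cut-offs; these thresholds are an assumption here,
    not taken from the xAAEnet study."""
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"

# Example: 48 apneas and 32 hypopneas over 6.5 hours of sleep.
ahi = apnea_hypopnea_index(48, 32, 6.5)            # ~12.3 events/hour
print(f"AHI = {ahi:.1f} -> {ahi_severity(ahi)}")   # mild
```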
Compteur Entry
Data Augmentation
Description
In today's world, artificial intelligence (AI) is omnipresent.
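As a hedged illustration of what data augmentation typically involves, the NumPy sketch below applies a random horizontal flip and Gaussian noise to image arrays; the shapes, noise level, and choice of transforms are illustrative assumptions, not this repository's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray, noise_std: float = 0.05) -> np.ndarray:
    """Apply a random horizontal flip and additive Gaussian noise.

    `image` is assumed to be an H x W x C float array in [0, 1];
    the noise level is an illustrative choice, not this repo's setting.
    """
    out = image.copy()
    if rng.random() < 0.5:
        out = out[:, ::-1, :]                          # horizontal flip
    out = out + rng.normal(0.0, noise_std, out.shape)  # additive noise
    return np.clip(out, 0.0, 1.0)

# Example: augment a batch of 8 random 32x32 RGB "images".
batch = rng.random((8, 32, 32, 3))
augmented = np.stack([augment(img) for img in batch])
print(augmented.shape)  # (8, 32, 32, 3)
```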
Reliability of Generative AI
Task 1: Uncertainty Estimation in QA Task
The goal of this task is to estimate the uncertainty of different Large Language Models (LLMs)
in a Question Answering (QA) task. To achieve this, the CRAG dataset, one of the most recent
datasets in this domain, is used. Originally released by Meta for Retrieval-Augmented
Generation (RAG) benchmarks, CRAG provides question categories that are valuable for evaluating
uncertainty estimation methods. The dataset categorizes questions across five domains (finance, sports, music, movies, and open domain).
Explainable Recommender System
Task: Empirical Study of LLM-Enhanced Explainable Recommendation From an HCI and ML Perspective
Recommender systems (RS) comprise a set of information filtering techniques whose purpose is to
recommend to a user a selection of items from a generally large corpus. These items are chosen based
on the user's preferences and characteristics, deduced from the history of their interactions with items
in the given corpus. Explainable RS simply provide explanations of the recommendation process to the user.
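To ground the terminology, the sketch below shows an item-based collaborative filtering recommender that attaches a simple textual explanation to each recommendation; the toy interaction matrix, similarity measure, and explanation template are illustrative assumptions rather than the study's actual system.

```python
import numpy as np

# Toy user-item interaction matrix (rows: users, cols: items); 1 = interacted.
# Both the data and the explanation template below are illustrative assumptions.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 1, 1],
], dtype=float)
item_names = ["A", "B", "C", "D", "E"]

def cosine_sim(m: np.ndarray) -> np.ndarray:
    """Item-item cosine similarity from a users x items matrix."""
    norms = np.linalg.norm(m, axis=0, keepdims=True) + 1e-9
    return (m / norms).T @ (m / norms)

def recommend_with_explanation(user: int, k: int = 1):
    sim = cosine_sim(interactions)            # item-item similarity
    seen = interactions[user] > 0
    scores = interactions[user] @ sim         # aggregate similarity to seen items
    scores[seen] = -np.inf                    # do not re-recommend seen items
    for item in np.argsort(scores)[::-1][:k]:
        source = np.argmax(sim[item] * seen)  # most similar item the user already has
        yield (item_names[item],
               f"recommended because it is similar to '{item_names[source]}', "
               f"which you interacted with")

for name, explanation in recommend_with_explanation(user=0):
    print(name, "->", explanation)
```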