  • Publication
    Multicategory SVMs by minimizing the distances among convex-hull prototypes
    (2008-11-10)
    Concha, Carlos; Candel, Diego; Moraga, Claudio
    In this paper, we study a single-objective extension of support vector machines for multicategory classification. Extending the dual formulation of binary SVMs, the algorithm minimizes the sum of all pairwise distances among a set of prototypes, each one constrained to lie in one of the convex hulls enclosing a class of examples. The final discriminant system is built by finding an appropriate reference point in the feature space. The resulting method preserves the form and complexity of the binary case, optimizing a single convex objective function with m variables and 2m+K constraints, where m is the number of examples and K the number of classes. Non-linear extensions are straightforward using kernels, while soft-margin versions can be obtained by using reduced convex hulls. Experimental results on well-known UCI benchmarks are presented, comparing the accuracy and efficiency of the proposed approach with other state-of-the-art methods.
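    A minimal sketch of the prototype step described in the abstract, assuming a linear kernel and the hard-margin case: one prototype per class is constrained to the convex hull of that class's examples, and the sum of pairwise squared distances among prototypes is minimized. The cvxpy formulation and the function name convex_hull_prototypes below are illustrative assumptions, not the authors' code; the paper itself works with a dual formulation of m variables and 2m+K constraints and supports kernels.

        import numpy as np
        import cvxpy as cp

        def convex_hull_prototypes(X, y):
            # One prototype per class, written as a convex combination of that
            # class's examples; minimize the sum of pairwise squared distances
            # among the prototypes (linear, hard-margin illustration).
            classes = np.unique(y)
            constraints, prototypes = [], []
            for c in classes:
                Xc = X[y == c]                      # examples of class c, shape (n_c, d)
                a = cp.Variable(Xc.shape[0], nonneg=True)
                constraints.append(cp.sum(a) == 1)  # stay inside the convex hull
                prototypes.append(Xc.T @ a)         # prototype for class c
            objective = 0
            for i in range(len(classes)):
                for j in range(i + 1, len(classes)):
                    objective += cp.sum_squares(prototypes[i] - prototypes[j])
            cp.Problem(cp.Minimize(objective), constraints).solve()
            return classes, np.column_stack([p.value for p in prototypes])

    For the soft-margin version mentioned in the abstract, each coefficient would additionally be bounded above (a reduced convex hull), and the discriminant system is then built from an appropriate reference point among the prototypes.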
  • Publication
    A Data Ingestion Procedure towards a Medical Images Repository
    This article presents an ingestion procedure towards an interoperable repository called ALPACS (Anonymized Local Picture Archiving and Communication System). ALPACS provides services to clinical and hospital users, who can access the repository data through an Artificial Intelligence (AI) application called PROXIMITY. This article shows the automated procedure for data ingestion from the medical imaging provider to the ALPACS repository. The data ingestion procedure was successfully applied by the data provider (Hospital Clínico de la Universidad de Chile, HCUCH) using a pseudo-anonymization algorithm at the source, thereby ensuring that the privacy of patients’ sensitive data is respected. Data transfer was carried out using international communication standards for health systems, which allows for replication of the procedure by other institutions that provide medical images.
    Objectives: This article aims to create a repository of 33,000 medical CT images and 33,000 diagnostic reports with international standards (HL7 HAPI FHIR, DICOM, SNOMED). This goal requires devising a data ingestion procedure that can be replicated by other provider institutions, guaranteeing data privacy by implementing a pseudo-anonymization algorithm at the source, and generating labels from annotations via NLP.
    Methodology: Our approach involves hybrid on-premise/cloud deployment of PACS and FHIR services, including transfer services for anonymized data to populate the repository through a structured ingestion procedure. We used NLP over the diagnostic reports to generate annotations, which were then used to train ML algorithms for content-based similar exam recovery.
    Outcomes: We successfully implemented ALPACS and PROXIMITY 2.0, ingesting almost 19,000 thorax CT exams to date along with their corresponding reports.
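    As an illustration of what pseudo-anonymization at the source might look like, here is a hedged sketch using pydicom and a salted hash over the patient identifier. The article does not publish the algorithm used at HCUCH, so the field choices, the hashing scheme, and the function name pseudo_anonymize below are assumptions for illustration only.

        import hashlib
        import pydicom

        def pseudo_anonymize(path_in, path_out, salt):
            # Replace direct identifiers with a salted hash so the same patient
            # always maps to the same pseudonym across studies (illustrative only,
            # not the algorithm actually deployed by the provider).
            ds = pydicom.dcmread(path_in)
            patient_id = str(getattr(ds, "PatientID", ""))
            pseudonym = hashlib.sha256((salt + patient_id).encode()).hexdigest()[:16]
            ds.PatientID = pseudonym
            ds.PatientName = pseudonym
            for keyword in ("PatientBirthDate", "PatientAddress", "OtherPatientIDs"):
                if keyword in ds:                   # blank remaining direct identifiers
                    ds.data_element(keyword).value = ""
            ds.remove_private_tags()                # drop vendor-specific private tags
            ds.save_as(path_out)

    Keeping the pseudonym deterministic per patient is what would let the repository side link each transferred DICOM study to its corresponding diagnostic report without ever receiving the real identity.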