Browsing by Department "Departamento de Informática"

Now showing 1 - 20 of 49
  • Publication
    A Co-Evolutionary Scheme for Multi-Objective Evolutionary Algorithms Based on ϵ-Dominance
    (2019-01-01)
    Menchaca-Mendez, Adriana
    ;
    Montero, Elizabeth  
    ;
    Antonio, Luis Miguel
    ;
    Zapotecas-Martinez, Saul
    ;
    Coello Coello, Carlos A.
    ;
    Riff, María-Cristina
    Convergence and diversity of solutions play an essential role in the design of multi-objective evolutionary algorithms (MOEAs). Among the available diversity mechanisms, ϵ-dominance has shown a proper balance between convergence and diversity. When using ϵ-dominance, diversity is ensured by partitioning the objective space into boxes of size ϵ and, typically, a single solution is allowed in each of these boxes. However, there is no easy way to determine the precise value of ϵ. In this paper, we investigate how this goal can be achieved by using a co-evolutionary scheme that looks for proper values of ϵ during the search, without any need for prior user knowledge. We include the proposed co-evolutionary scheme in an MOEA based on ϵ-dominance, giving rise to a new MOEA. We evaluate the proposed MOEA on standard benchmark test problems. According to our results, it is a promising alternative for solving multi-objective optimization problems for three main reasons: 1) it is competitive with state-of-the-art MOEAs, 2) it does not need extra information about the problem, and 3) it is computationally efficient.
    Scopus© Citations 14
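The box-partitioning idea behind ϵ-dominance in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the `eps_archive` function and its tie-breaking rule (keep the solution nearest the box's lower corner) are one common variant, assumed here for concreteness.

```python
from math import floor

def box_index(objectives, eps):
    """Map an objective vector to its epsilon-box (minimization)."""
    return tuple(floor(f / e) for f, e in zip(objectives, eps))

def eps_archive(solutions, eps):
    """Keep at most one solution per epsilon-box: the one closest
    to the box's lower corner (a common tie-breaking rule)."""
    archive = {}
    for s in solutions:
        b = box_index(s, eps)
        corner = tuple(bi * e for bi, e in zip(b, eps))
        dist = sum((f - c) ** 2 for f, c in zip(s, corner))
        if b not in archive or dist < archive[b][0]:
            archive[b] = (dist, s)
    return [s for _, s in archive.values()]
```

Larger ϵ values yield coarser boxes and hence fewer, more spread-out solutions; the paper's contribution is co-evolving ϵ itself instead of fixing it a priori.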
  • Publication
    A Constraint Programming Formulation of the Multi-Mode Resource-Constrained Project Scheduling Problem for the Flexible Job Shop Scheduling Problem
    (2023-01-01)
    Yuraszeck, Francisco
    ;
    Montero, Elizabeth  
    ;
    Canut-De-Bon, Dario
    ;
    Cuneo, Nicolas
    ;
    Rojel, Maximiliano
    In this work, a constraint programming (CP) formulation of the multi-mode resource-constrained project scheduling problem (MMRCPSP) is proposed for solving the flexible job shop scheduling problem (FJSSP) under the makespan minimization criterion. The resulting CP model allows us to tackle the classical instances of the FJSSP (where the operations of a given job follow a linear order). It can also handle FJSSP instances where the precedence relationships between operations are defined by an arbitrary directed acyclic graph (sequencing flexibility). The performance of our approach was tested using 271 classical FJSSP instances and 50 FJSSP instances with sequencing flexibility. We establish the validity of our approach by achieving an average relative percentage deviation of 3.04% and 0.18% when compared to the best-known lower and upper bounds, respectively. Additionally, we were able to contribute to the literature with ten new lower bounds and two new upper bounds. Our CP approach is relatively simple yet competitive and can be quickly applied and adapted by new practitioners in the area.
    Scopus© Citations 8
  • Publication
    A Data Ingestion Procedure towards a Medical Images Repository
    (2024-08-01)
    Solar, Mauricio  
    ;
    Castañeda, Victor
    ;
    Ñanculef, Ricardo
    ;
    Dombrovskaia, Lioubov  
    ;
    Araya, Mauricio  
    This article presents an ingestion procedure towards an interoperable repository called ALPACS (Anonymized Local Picture Archiving and Communication System). ALPACS provides services to clinical and hospital users, who can access the repository data through an Artificial Intelligence (AI) application called PROXIMITY. This article shows the automated procedure for data ingestion from the medical imaging provider to the ALPACS repository. The data ingestion procedure was successfully applied by the data provider (Hospital Clínico de la Universidad de Chile, HCUCH) using a pseudo-anonymization algorithm at the source, thereby ensuring that the privacy of patients’ sensitive data is respected. Data transfer was carried out using international communication standards for health systems, which allows for replication of the procedure by other institutions that provide medical images. Objectives: This article aims to create a repository of 33,000 medical CT images and 33,000 diagnostic reports with international standards (HL7 HAPI FHIR, DICOM, SNOMED). This goal requires devising a data ingestion procedure that can be replicated by other provider institutions, guaranteeing data privacy by implementing a pseudo-anonymization algorithm at the source, and generating labels from annotations via NLP. Methodology: Our approach involves hybrid on-premise/cloud deployment of PACS and FHIR services, including transfer services for anonymized data to populate the repository through a structured ingestion procedure. We used NLP over the diagnostic reports to generate annotations, which were then used to train ML algorithms for content-based similar exam recovery. Outcomes: We successfully implemented ALPACS and PROXIMITY 2.0, ingesting almost 19,000 thorax CT exams to date along with their corresponding reports.
    Scopus© Citations 1
  • Publication
    A Heuristic Approach for Determining Efficient Vaccination Plans under a SARS-CoV-2 Epidemic Model
    (2023-02-01)
    Hazard-Valdés, Claudia
    ;
    Montero, Elizabeth  
    In this work, we propose a local search-based strategy to determine high-quality allocation of vaccines under restricted budgets and time periods. For this, disease spread is modeled as a SEAIR pandemic model. Subgroups are used to understand and evaluate movement restrictions and their effect on interactions between geographical divisions. A tabu search heuristic method is used to determine the number of vaccines and the groups to allocate them in each time period, minimizing the maximum number of infected people at the same time and the total infected population. Available data for COVID-19 daily cases was used to adjust the parameters of the SEAIR models in four study cases: Austria, Belgium, Denmark, and Chile. From these, we can analyze how different vaccination schemes are more beneficial for the population as a whole based on different reproduction numbers, interaction levels, and the availability of resources in each study case. Moreover, from these experiments, a strong relationship between the defined objectives is noticed.
    Scopus© Citations 5
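The SEAIR dynamics referenced in the abstract above can be sketched as a discrete update. This is an illustrative parameterization only: the compartment flows, parameter names (`beta`, `sigma`, `p_a`, `gamma`), and the forward-Euler step are assumptions for exposition, not the paper's exact model.

```python
def seair_step(state, beta, sigma, p_a, gamma, dt=1.0):
    """One forward-Euler step of a basic SEAIR model (illustrative).
    S->E via contact with A and I; E leaves at rate sigma, a fraction
    p_a becoming asymptomatic (A) and the rest infected (I); A and I
    recover at rate gamma. Population size N stays constant."""
    S, E, A, I, R = state
    N = S + E + A + I + R
    new_exposed = beta * S * (A + I) / N
    dS = -new_exposed
    dE = new_exposed - sigma * E
    dA = p_a * sigma * E - gamma * A
    dI = (1 - p_a) * sigma * E - gamma * I
    dR = gamma * (A + I)
    return tuple(x + dt * d for x, d in zip(state, (dS, dE, dA, dI, dR)))
```

A vaccination decision would then remove vaccinated individuals from S before each step; the paper's tabu search chooses how many vaccines to allocate to which subgroup in each time period.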
  • Publication
    A machine learning method for high-frequency data forecasting
    (2014-01-01)
    López, Erick
    ;
    Allende, Héctor
    ;
    Allende-Cid, Héctor
    In recent years, several models for financial high-frequency data have been proposed. One of the best-known models for this type of application is the ACM-ACD model, which focuses on modelling the underlying joint distribution of both duration and price changes between consecutive transactions. However, this model imposes distributional assumptions, and its number of parameters increases rapidly, producing a complex and slow adjustment process. Therefore, we propose using two machine learning models that work sequentially, based on the ACM-ACD model. The results show comparable performance, with better performance in some cases. The proposal also achieves significantly faster convergence. The proposal is validated with a well-known financial data set.
  • Publication
    A model to assess open government data in public agencies
    (2012-09-05)
    Solar, Mauricio  
    ;
    Concha, Gastón
    ;
    Meijueiro, Luis
    In this article a maturity model, named OD-MM (Open Data Maturity Model), is proposed to assess the commitment and capabilities of public agencies in pursuing the principles and practices of open data. The OD-MM model has a three-level hierarchical structure composed of domains, sub-domains and critical variables. Four capacity levels are defined for each of the 33 critical variables distributed across nine sub-domains in order to determine the organization's maturity level. The model is a very valuable diagnostic tool for public services, since it reveals all weaknesses and provides a roadmap for progressing in the implementation of open data.
    Scopus© Citations 48
  • Publication
    A search for an unexpected asymmetry in the production of e+μ− and e−μ+ pairs in proton–proton collisions recorded by the ATLAS detector at √s = 13 TeV
    (2022-07-10)
    Aad, G.
    ;
    Abbott, B.
    ;
    Abbott, D. C.
    ;
    Abed Abud, A.
    ;
    Abeling, K.
    ;
    Abhayasinghe, D. K.
    ;
    Abidi, S. H.
    ;
    Aboulhorma, A.
    ;
    Abramowicz, H.
    ;
    Abreu, H.
    ;
    Abulaiti, Y.
    ;
    Abusleme Hoffman, A. C.
    ;
    Acharya, B. S.
    ;
    Achkar, B.
    ;
    Adam, L.
    ;
    Adam Bourdarios, C.
    ;
    Adamczyk, L.
    ;
    Adamek, L.
    ;
    Addepalli, S. V.
    ;
    Adelman, J.
    ;
    Adiguzel, A.
    ;
    Adorni, S.
    ;
    Adye, T.
    ;
    Affolder, A. A.
    ;
    Afik, Y.
    ;
    Agapopoulou, C.
    ;
    Agaras, M. N.
    ;
    Agarwala, J.
    ;
    Aggarwal, A.
    ;
    Agheorghiesei, C.
    ;
    Aguilar-Saavedra, J. A.
    ;
    Ahmad, A.
    ;
    Ahmadov, F.
    ;
    Ahmed, W. S.
    ;
    Ai, X.
    ;
    Aielli, G.
    ;
    Aizenberg, I.
    ;
    Akatsuka, S.
    ;
    Akbiyik, M.
    ;
    Åkesson, T. P.A.
    ;
    Akimov, A. V.
    ;
    Al Khoury, K.
    ;
    Alberghi, G. L.
    ;
    Albert, J.
    ;
    Albicocco, P.
    ;
    Alconada Verzini, M. J.
    ;
    Alderweireldt, S.
    ;
    Aleksa, M.
    ;
    Aleksandrov, I. N.
    ;
    Alexa, C.
    ;
    Alexopoulos, T.
    ;
    Alfonsi, A.
    ;
    Alfonsi, F.
    ;
    Alhroob, M.
    ;
    Ali, B.
    ;
    Ali, S.
    ;
    Aliev, M.
    ;
    Alimonti, G.
    ;
    Allaire, C.
    ;
    Allbrooke, B. M.M.
    ;
    Allport, P. P.
    ;
    Aloisio, A.
    ;
    Alonso, F.
    ;
    Alpigiani, C.
    ;
    Alunno Camelia, E.
    ;
    Alvarez Estevez, M.
    ;
    Alviggi, M. G.
    ;
    Amaral Coutinho, Y.
    ;
    Ambler, A.
    ;
    Ambroz, L.
    ;
    Amelung, C.
    ;
    Amidei, D.
    ;
    Amor Dos Santos, S. P.
    ;
    Amoroso, S.
    ;
    Amos, K. R.
    ;
    Amrouche, C. S.
    ;
    Ananiev, V.
    ;
    Anastopoulos, C.
    ;
    Andari, N.
    ;
    Andeen, T.
    ;
    Anders, J. K.
    ;
    Andrean, S. Y.
    ;
    Andreazza, A.
    ;
    Angelidakis, S.
    ;
    Angerami, A.
    ;
    Anisenkov, A. V.
    ;
    Annovi, A.
    ;
    Antel, C.
    ;
    Anthony, M. T.
    ;
    Antipov, E.
    ;
    Antonelli, M.
    ;
    Antrim, D. J.A.
    ;
    Anulli, F.
    ;
    Brooks, William K.  
    ;
    Carquin, Edson
    ;
    Pezoa, Raquel  
    ;
    Robles Gajardo, C. M.
    ;
    Viaux Maira, Nicolas  
    ;
    Araujo Ferraz, V.
    ;
    Arcangeletti, C.
    This search, of a type not previously performed at ATLAS, uses a comparison of the production cross sections for e+μ− and e−μ+ pairs to constrain physics processes beyond the Standard Model. It uses 139 fb−1 of proton–proton collision data recorded at √s = 13 TeV at the LHC. Targeting sources of new physics which prefer final states containing e+μ− over e−μ+, the search contains two broad signal regions which are used to provide model-independent constraints on the ratio of cross sections at the 2% level. The search also has two special selections targeting supersymmetric models and leptoquark signatures. Observations using one of these selections are able to exclude, at 95% confidence level, singly produced smuons with masses up to 640 GeV in a model in which the only other light sparticle is a neutralino, when the R-parity-violating coupling λ231 is close to unity. Observations using the other selection exclude scalar leptoquarks with masses below 1880 GeV when g^{eu}_{1R} = g^{μc}_{1R} = 1, at 95% confidence level. The limit on the coupling reduces to g^{eu}_{1R} = g^{μc}_{1R} = 0.46 for a mass of 1420 GeV.
    Scopus© Citations 4
  • Publication
    A Study on Information Disorders on Social Networks during the Chilean Social Outbreak and COVID-19 Pandemic
    (MDPI AG, 2023-05-01)
    Mendoza, Marcelo
    ;
    Valenzuela, Sebastián
    ;
    Núñez-Mussa, Enrique
    ;
    Padilla, Fabián
    ;
    Providel, Eliana
    ;
    Campos, Sebastián
    ;
    Bassi, Renato
    ;
    Riquelme, Andrea
    ;
    Aldana, Valeria
    ;
    López, Claudia  
    Information disorders on social media can have a significant impact on citizens’ participation in democratic processes. To better understand the spread of false and inaccurate information online, this research analyzed data from Twitter, Facebook, and Instagram. The data were collected and verified by professional fact-checkers in Chile between October 2019 and October 2021, a period marked by political and health crises. The study found that false information spreads faster and reaches more users than true information on Twitter and Facebook. Instagram, on the other hand, seemed to be less affected by this phenomenon. False information was also more likely to be shared by users with lower reading comprehension skills. True information, on the other hand, tended to be less verbose and generate less interest among audiences. This research provides valuable insights into the characteristics of misinformation and how it spreads online. By recognizing the patterns of how false information diffuses and how users interact with it, we can identify the circumstances in which false and inaccurate messages are prone to becoming widespread. This knowledge can help us to develop strategies to counter the spread of misinformation and protect the integrity of democratic processes.
  • Publication
    A survey on the dynamic scheduling problem in astronomical observations
    (2010-09-30)
    Mora, Matias
    ;
    Solar, Mauricio  
    Task execution scheduling is a common problem in computer science. The typical problem, as in industrial or computer-processing applications, has some restrictions that are inapplicable in certain cases. For example, all available tasks have to be executed at some point, and ambient factors do not affect the execution order. In the field of astronomical observations, projects are scheduled as observation blocks, and their execution depends on parameters like science-goal priority and target visibility, but is also restricted by external factors: atmospheric conditions, equipment failure, etc. A telescope scheduler is mainly in charge of handling projects, commanding the telescope's high-level movement to targets, and starting data acquisition. With the growth of observatories' capacities and maintenance costs, it is now mandatory to optimize observation time allocation. Currently, professional observatories still depend strongly on human intervention, with no fully automatic solution so far. This paper aims to describe the dynamic scheduling problem in astronomical observations and to provide a survey of existing solutions, opening new application opportunities for computer science.
    Scopus© Citations 11
  • Publication
    An Improved S-Metric Selection Evolutionary Multi-Objective Algorithm with Adaptive Resource Allocation
    (2018-01-01)
    Menchaca-Méndez, Adriana
    ;
    Montero Ureta, Elizabeth Del Carmen  
    ;
    Zapotecas-Martínez, Saúl
    One of the main disadvantages of evolutionary multi-objective algorithms (EMOAs) based on hypervolume is the computational cost of the hypervolume computation. This deficiency gets worse either when an EMOA calculates the hypervolume several times or when it is dealing with problems having more than three objectives. In this sense, some researchers have designed strategies to reduce the number of hypervolume calculations. Among them, the use of the locality property of the hypervolume has emerged as an alternative to deal with this problem. This property states that if a solution is moving in its neighborhood, only its contribution is affected and the contributions of the rest of the solutions remain the same. In this paper, we present a novel evolutionary approach that exploits the locality property of the hypervolume. The proposed approach adopts a probability to use two or three individuals in its environmental selection procedure. In this way, it only needs to compute two or three hypervolume contributions per iteration. The proposed algorithm is evaluated by solving the standard benchmark test problems and two real-world applications where the features of the problems are unknown. According to the results, the proposed approach is a promising alternative for solving problems with a high number of objectives because of three main reasons: 1) it is competitive with respect to the state-of-the-art EMOAs based on hypervolume; 2) it does not need extra information about the problem (which is particularly essential when solving real-world applications); and 3) its computational cost is much lower than the other hypervolume-based EMOAs.
    Scopus© Citations 9
  • Publication
    Application of meta-heuristic approaches in the spectral power clustering technique (SPCT) to improve the separation of partial discharge and electrical noise sources
    (2019-01-01)
    Ardila-Rey, Jorge Alfredo  
    ;
    Poblete, Nicolás Medina
    ;
    Montero, Elizabeth  
    In order to achieve an adequate diagnosis of the insulation system in any electrical asset, it is necessary to carry out a proper separation process after measuring partial discharges (PD), since during data acquisition it is very likely that simultaneous PD sources and electrical noise have been measured. Clearly, such separation will simplify the subsequent identification process, because the analysis will be done individually for each of the sources and not over the total of the signals. In this sense, the Spectral Power Clustering Technique (SPCT) has proven to be an effective technique for separating multiple sources acting simultaneously in a monitoring process. The effectiveness of this separation technique rests fundamentally on the proper selection of frequency bands, or separation intervals, where the spectral power of the pulses differs for each source. If the wrong bands are selected, the clusters will overlap, hiding the presence of the total number of sources. This research evaluates the performance of different meta-heuristic algorithms when applied to the SPCT for selecting separation intervals. The results obtained from measurements made on different test objects allow us to determine the most appropriate technique for separating PD sources and electrical noise acting simultaneously on an insulation system.
    Scopus© Citations 8
  • Publication
    Artificial neural network (ANN) modelling to estimate bubble size from macroscopic image and object features
    (2023-01-01)
    Vinnett, Luis  
    ;
    León, Roberto  
    ;
    Mesa, Diego
    Bubble size measurements in aerated systems such as froth flotation cells are critical for controlling gas dispersion. Commonly, bubbles are measured by obtaining representative photographs, which are then analyzed using segmentation and identification software tools. Recent developments have focused on enhancing these segmentation tools. However, the main challenges around complex bubble cluster segmentation remain unresolved, while the tools to tackle these challenges have become increasingly complex and computationally expensive. In this work, we propose an alternative solution, circumventing the need for image segmentation and bubble identification. An Artificial Neural Network (ANN) was trained to estimate the Sauter mean bubble size (D32) based on macroscopic image features obtained with simple and inexpensive image analysis. The results showed excellent prediction accuracy, with a correlation coefficient, R, over 0.998 in the testing stage, and without bias in its error distribution. This machine learning tool paves the way for robust and fast estimation of bubble size in complex bubble images, without the need for image segmentation.
    Scopus© Citations 1
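The target quantity in the abstract above, the Sauter mean diameter D32, has a standard definition that is easy to state in code. A minimal sketch (the function name is ours, not the paper's):

```python
def sauter_mean(diameters):
    """Sauter mean diameter D32 = sum(d^3) / sum(d^2): the diameter
    of a sphere with the same volume-to-surface-area ratio as the
    whole bubble population."""
    return sum(d ** 3 for d in diameters) / sum(d ** 2 for d in diameters)
```

The paper's ANN predicts this single scalar directly from macroscopic image features, which is why no per-bubble segmentation is needed.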
  • Publication
    Azimuthal Angle Correlations of Muons Produced via Heavy-Flavor Decays in 5.02 TeV Pb+Pb and pp Collisions with the ATLAS Detector
    (2024-05-17)
    Filmer, E. K.
    ;
    Grant, C. M.
    ;
    Jackson, P.
    ;
    Kong, A. X.Y.
    ;
    Pandya, H. D.
    ;
    Potti, H.
    ;
    Ruggeri, T. A.
    ;
    Ting, E. X.L.
    ;
    White, M. J.
    ;
    Gingrich, D. M.
    ;
    Lindon, J. H.
    ;
    Nishu, N.
    ;
    Pinfold, J. L.
    ;
    Cakir, O.
    ;
    Yildiz, H. Duran
    ;
    Kuday, S.
    ;
    Turk Cakir, I.
    ;
    Sultansoy, S.
    ;
    Adam Bourdarios, C.
    ;
    Arnaez, O.
    ;
    Berger, N.
    ;
    Castillo, F. L.
    ;
    Costanza, F.
    ;
    Delmastro, M.
    ;
    Di Ciaccio, L.
    ;
    Hryn’ova, T.
    ;
    Jézéquel, S.
    ;
    Koletsou, I.
    ;
    Levêque, J.
    ;
    Lewis, D. J.
    ;
    Little, J. D.
    ;
    Lorenzo Martinez, N.
    ;
    Poddar, G.
    ;
    Sanchez Pineda, A.
    ;
    Sauvan, E.
    ;
    Bernardi, G.
    ;
    Bomben, M.
    ;
    Li, A.
    ;
    Li, T.
    ;
    Marchiori, G.
    ;
    Nakkalil, K.
    ;
    Shen, Q.
    ;
    Zhang, Y.
    ;
    Chekanov, S.
    ;
    Darmora, S.
    ;
    Hopkins, W. H.
    ;
    Hoya, J.
    ;
    Love, J.
    ;
    Luongo, N. A.
    ;
    Metcalfe, J.
    ;
    Mete, A. S.
    ;
    Paramonov, A.
    ;
    Proudfoot, J.
    ;
    Van Gemmeren, P.
    ;
    Wamorkar, T.
    ;
    Wang, R.
    ;
    Zhang, J.
    ;
    Cheu, E.
    ;
    Cui, Z.
    ;
    Ghosh, A.
    ;
    Johns, K. A.
    ;
    Lampl, W.
    ;
    Lindley, R. E.
    ;
    Loch, P.
    ;
    Rutherfoord, J. P.
    ;
    Sardain, J.
    ;
    Varnes, E. W.
    ;
    Zhou, H.
    ;
    Zhou, Y.
    ;
    Bakshi Gupta, D.
    ;
    Burghgrave, B.
    ;
    Cardenas, J. C.J.
    ;
    De, K.
    ;
    Farbin, A.
    ;
    Hadavand, H. K.
    ;
    Myers, A. J.
    ;
    Ozturk, N.
    ;
    Usai, G.
    ;
    White, A.
    ;
    Angelidakis, S.
    ;
    Fassouliotis, D.
    ;
    Fountas, L.
    ;
    Gkialas, I.
    ;
    Kourkoumelis, C.
    ;
    Alexopoulos, T.
    ;
    Drivas-Koulouris, I.
    ;
    Gazis, E. N.
    ;
    Kitsaki, C.
    ;
    Maltezos, S.
    ;
    Paraskevopoulos, C.
    ;
    Perganti, M.
    ;
    Tzanis, P.
    ;
    Andeen, T.
    ;
    Burton, C. D.
    ;
    Choi, K.
    ;
    Onyisi, P. U.E.
    ;
    Panchal, D. K.
    ;
    Tost, M.
    ;
    Unal, M.
    ;
    Huseynov, N.
    ;
    Allendes Flores, Cristian Andres
    ;
    Brooks, William K.
    ;
    Carquin, Edson
    ;
    Fernandez Luengo, S.I.
    ;
    Fuenzalida Garrido, S.
    ;
    Pezoa, Raquel
    ;
    Robles Gajardo, Carolina
    ;
    Tapia Araya, S.
    ;
    Viaux Maira, Nicolas
    Angular correlations between heavy quarks provide a unique probe of the quark-gluon plasma created in ultrarelativistic heavy-ion collisions. Results are presented of a measurement of the azimuthal angle correlations between muons originating from semileptonic decays of heavy quarks produced in 5.02 TeV Pb+Pb and pp collisions at the LHC. The muons are measured with transverse momenta and pseudorapidities satisfying p_T^μ > 4 GeV and |η^μ| < 2.4, respectively. The distributions of azimuthal angle separation Δφ for muon pairs having pseudorapidity separation |Δη| > 0.8 are measured in different Pb+Pb centrality intervals and compared to the same distribution measured in pp collisions at the same center-of-mass energy. Results are presented separately for muon pairs with opposite-sign charges, same-sign charges, and all pairs. A clear peak is observed in all Δφ distributions at Δφ ∼ π, consistent with the parent heavy-quark pairs being produced via hard-scattering processes. The widths of that peak, characterized using Cauchy-Lorentz fits to the Δφ distributions, are found not to vary significantly as a function of Pb+Pb collision centrality and are similar for pp and Pb+Pb collisions. This observation will provide important constraints on theoretical descriptions of heavy-quark interactions with the quark-gluon plasma.
    Scopus© Citations 2
  • Publication
    Behind the myths of citizen participation: Identifying sustainability factors of hyper-local information systems
    (2017-11-01)
    López, Claudia  
    ;
    Farzan, Rosta
    ;
    Lin, Yu Ru
    Various information systems have emerged to facilitate citizen participation in the life of their communities. However, there is a lack of robust understanding of what enables the sustainability of such systems. This work introduces a framework to identify and analyze various factors that influence the sustainability of “hyper-local” information systems. Using longitudinal observations of participation from 35 online neighborhood discussion forums over six years, we analyze the relationship between sustainability and online–offline community characteristics. Our results not only show patterns consistent with previous observations but reveal the dubious influences of member heterogeneity and network structure. Design insights are discussed.
    Scopus© Citations 11
  • Publication
    Bridging the gap between software architecture rationale formalisms and actual architecture documents: An ontology-driven approach
    (2012-01-01)
    López, Claudia  
    ;
    Codocedo, Víctor
    ;
    Astudillo, Hernán
    ;
    Cysneiros, Luiz Marcio
    Documenting software architecture rationale is essential to reuse and evaluate architectures, and several modeling and documentation guidelines have been proposed in the literature. However, in practice, creating and updating these documents is rarely a primary activity in most software projects, and rationale remains hidden in casual and semi-structured records, such as e-mails, meeting notes, wikis, and specialized documents. This paper describes the TREx (Toeska Rationale Extraction) approach to recover, represent and explore rationale information from text documents, combining: (1) pattern-based information extraction to recover rationale; (2) ontology-based representation of rationale and architectural concepts; and (3) facet-based interactive exploration of rationale. Initial results from TREx's application suggest that some kinds of architecture rationale can be semi-automatically extracted from a project's unstructured text documents, namely decisions, alternatives and requirements. The approach and some tools are illustrated with a case study of rationale recovery for a financial securities settlement system.
    Scopus© Citations 45
  • Publication
    Classical Machine Learning Techniques in the Search of Extrasolar Planets
    (Centro Latino Americano de Estudios en Informatica, 2019-01-01)
    Bugueño, Margarita  
    ;
    Mena, Francisco
    ;
    Araya, Mauricio  
    The field of astronomical data analysis has experienced an important paradigm shift in recent years. The automation of certain analysis procedures is no longer a desirable feature for reducing human effort, but a must-have asset for coping with the extremely large datasets that new instrumentation technologies are producing. In particular, the detection of transit planets (bodies that move across the face of another body) is an ideal setup for intelligent automation. Knowing whether the variation within a light curve is evidence of a planet requires applying advanced pattern-recognition methods to a very large number of candidate stars. Here we present a supervised learning approach to refine the results produced by a case-by-case analysis of light curves, harnessing the generalization power of machine learning techniques to predict the currently unclassified light curves. The method uses feature engineering to find a suitable representation for classification, and different performance criteria to evaluate the representations and decide among them. Our results show that this automatic technique can help to speed up the very time-consuming manual process currently done by expert scientists.
  • Publication
    Connecting the Dots: What Graph-Based Text Representations Work Best for Text Classification using Graph Neural Networks?
    (2023-01-01)
    Bugueño, Margarita  
    ;
    de Melo, Gerard
    Given the success of Graph Neural Networks (GNNs) for structure-aware machine learning, many studies have explored their use for text classification, but mostly in specific domains with limited data characteristics. Moreover, some strategies prior to GNNs relied on graph mining and classical machine learning, making it difficult to assess their effectiveness in modern settings. This work extensively investigates graph representation methods for text classification, identifying practical implications and open challenges. We compare different graph construction schemes using a variety of GNN architectures and setups across five datasets, encompassing short and long documents as well as unbalanced scenarios in diverse domains. Two Transformer-based large language models are also included to complement the study. The results show that i) although the effectiveness of graphs depends on the textual input features and domain, simple graph constructions perform better the longer the documents are, ii) graph representations are especially beneficial for longer documents, outperforming Transformer-based models, iii) graph methods are particularly efficient at solving the task.
    Scopus© Citations 4
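Among the graph construction schemes the paper above compares, a sliding-window word co-occurrence graph is one of the simplest. A minimal sketch (the function name and `window` parameter are ours, for illustration only):

```python
def word_cooccurrence_graph(tokens, window=2):
    """Build an undirected co-occurrence graph from a token list:
    nodes are unique tokens; an edge links two distinct tokens that
    appear within `window` positions of each other."""
    edges = set()
    for i, t in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            if t != tokens[j]:
                edges.add(tuple(sorted((t, tokens[j]))))
    return sorted(set(tokens)), sorted(edges)
```

A GNN would then propagate node (word) embeddings along these edges and pool them into a document representation for classification.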
  • Publication
    Deep learning techniques to process 3D chest CT
    (2024-01-01)
    Solar, Mauricio  
    ;
    Aguirre Olea, Pablo Leopoldo  
    The idea of using X-rays and Computed Tomography (CT) images as a diagnostic method has been explored in several studies. Most of these studies work with 2D slices of the CT image, requiring less computational capacity and less processing time than 3D. Processing volumetric data (the complete CT image in 3D) adds an extra dimension of information. However, the magnitude of the data is considerably larger than when working with 2D slices, so extra computational processing is required. In this study, a model capable of classifying a 3D input that represents the volume of the CT scan is proposed. The model classifies the 3D input as COVID-19 or Non-COVID-19 while reducing the resources used to perform the classification. The proposed model is the ResNet-50 model with a new dimension of information added, which is a simple autoencoder. This autoencoder is trained on the same dataset, and a vector representation of each exam is generated and used together with the exams to feed the ResNet-50. To validate the proposal, the same model is compared with and without the autoencoder module that provides extra information. The proposed model obtains better metrics than the same model without the autoencoder, confirming that extracting relevant features from the dataset helps improve the performance of the model.
    Scopus© Citations 2
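The fusion step described above, concatenating an autoencoder's latent vector with backbone features before the final classification head, can be sketched as follows. The encoder here is a random projection standing in for the trained autoencoder, and all shapes and weights are toy placeholders rather than the actual ResNet-50 architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(volume, latent_dim=32):
    """Stand-in for the trained autoencoder's encoder: maps a flattened
    3D CT volume to a compact latent vector (a random projection here,
    purely illustrative)."""
    flat = volume.reshape(-1)
    w = rng.standard_normal((latent_dim, flat.size)) / np.sqrt(flat.size)
    return w @ flat

def fuse_and_classify(backbone_features, latent, w_out, b_out):
    """Concatenate backbone features with the autoencoder latent vector
    and apply a linear + sigmoid head, mirroring the fusion idea in the
    abstract (the weights are placeholders, not trained values)."""
    fused = np.concatenate([backbone_features, latent])
    logit = w_out @ fused + b_out
    return 1.0 / (1.0 + np.exp(-logit))  # probability of the positive class

volume = rng.standard_normal((8, 8, 8))   # toy stand-in for a 3D CT scan
features = rng.standard_normal(64)        # toy stand-in for backbone features
latent = encode(volume)
w_out = rng.standard_normal(64 + 32) * 0.01
p = fuse_and_classify(features, latent, w_out, 0.0)
```

In the actual model the latent vector would come from an autoencoder trained on the same dataset, and the head would be trained end to end with the 3D backbone.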
  • Publication
    Development of a virtual model of fibro-bronchoscopy
    (2011-09-01)
    Solar, Mauricio  
    ;
    Ducoing, Eugenio
    A virtual model of fibro-bronchoscopy is reported. The virtual model represents the trachea and the bronchi in 3D, creating a virtual world of the bronchial tree. The bronchoscope is modeled to traverse the bronchial tree, imitating the displacement and rotation of the real bronchoscope. The parameters of the virtual model were gradually adjusted according to expert opinion, allowing specialists to train with a highly realistic virtual bronchoscope. The virtual bronchial tree provides realistic cues regarding the movement of the bronchoscope, creating the illusion that the virtual instrument behaves like the real one, with all the cost benefits this implies.
    Scopus© Citations 1
  • Publication
    Dynamic image segmentation method using hierarchical clustering
    (2009-12-01)
    Galbiati, Jorge
    ;
    Allende, Héctor  
    ;
    Becerra, Carlos
    In this paper we explore the use of cluster analysis in segmentation problems, that is, labeling image points with the region or class they belong to. The proposed algorithm uses the well-known agglomerative hierarchical clustering algorithm to form clusters of pixels, modified to cope with the high dimensionality of the problem. The results of different stages of the algorithm are saved, retaining a collection of segmented images ordered by degree of segmentation. This allows users to view the whole collection and choose the image that best suits their particular application.
    Scopus© Citations 6
