Random search and reproducibility for neural architecture search /

"Neural architecture search (NAS) is a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures. Ameet Talwalkar (Carnegie Mellon University | Determined AI) shares work that aims to help ground the empirical results in th...

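The core observation in the description above, that NAS can be framed as a hyperparameter optimization problem for which random search is a strong baseline, can be illustrated with a short sketch. The following Python snippet is only an illustration of that idea, not material from the talk or this record: the search space, the evaluate_partial stub, and the simple early-stopping rule are placeholders standing in for a real NAS benchmark (such as PTB or CIFAR-10) and an actual training loop.

# Minimal sketch (illustrative only): random search over an architecture/
# hyperparameter space, with a simple early-stopping rule that discards
# configurations whose partial-training score already trails the best seen.
import random

# Hypothetical search space; a real NAS benchmark defines its own.
SEARCH_SPACE = {
    "num_layers": [4, 8, 12],
    "hidden_size": [128, 256, 512],
    "dropout": [0.0, 0.2, 0.5],
}

def sample_architecture(rng):
    """Draw one configuration uniformly at random from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def evaluate_partial(arch, budget):
    """Placeholder for 'train arch for `budget` epochs and return validation
    accuracy'. A real NAS run would train a network here; this stub returns
    a pseudo-random score so the sketch stays runnable."""
    rng = random.Random(hash(frozenset(arch.items())) ^ budget)
    return rng.uniform(0.0, 1.0)

def random_search_with_early_stopping(num_samples=20, budgets=(1, 4, 16), seed=0):
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_samples):
        arch = sample_architecture(rng)
        score = float("-inf")
        for budget in budgets:  # successively larger training budgets
            score = evaluate_partial(arch, budget)
            # Early stopping: drop the candidate if, at this partial budget,
            # it already scores below the best fully evaluated configuration.
            if score < best_score:
                break
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search_with_early_stopping()
    print("best architecture:", arch, "score:", round(score, 3))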

Bibliographic Details
Classification: Electronic Book
Corporate Author: O'Reilly Artificial Intelligence Conference
Format: Electronic Conference proceedings Video
Language: English
Published: [Place of publication not identified] : O'Reilly, 2019.
Subjects: Neural networks (Computer science); Machine learning; Computer network architectures
Online Access: Full text (requires prior registration with an institutional email)

MARC

LEADER 00000cgm a2200000 i 4500
001 OR_on1127651198
003 OCoLC
005 20231017213018.0
006 m o c
007 cr cna||||||||
007 vz czazuu
008 191115s2019 xx 041 o vleng d
040 |a UMI  |b eng  |e rda  |e pn  |c UMI  |d UMI  |d OCLCF  |d OCLCQ  |d OCLCO 
029 1 |a AU@  |b 000066261537 
035 |a (OCoLC)1127651198 
037 |a CL0501000081  |b Safari Books Online 
050 4 |a QA76.87 
049 |a UAMI 
100 1 |a Talwalkar, Ameet,  |e on-screen presenter. 
245 1 0 |a Random search and reproducibility for neural architecture search /  |c Ameet Talwalkar. 
264 1 |a [Place of publication not identified] :  |b O'Reilly,  |c 2019. 
300 |a 1 online resource (1 streaming video file (40 min., 46 sec.)) 
336 |a two-dimensional moving image  |b tdi  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
337 |a video  |b v  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
511 0 |a Presenter, Ameet Talwalkar. 
500 |a Title from title screen (viewed November 14, 2019). 
518 |a Recorded April 17, 2019 at the O'Reilly Artificial Intelligence Conference in New York. 
520 |a "Neural architecture search (NAS) is a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures. Ameet Talwalkar (Carnegie Mellon University | Determined AI) shares work that aims to help ground the empirical results in this field and proposes new NAS baselines that build off the following observations: NAS is a specialized hyperparameter optimization problem, and random search is a competitive baseline for hyperparameter optimization. Leveraging these observations, Ameet evaluates both random search with early-stopping and a novel random search with a weight-sharing algorithm on two standard NAS benchmarks: PTB and CIFAR-10. Results show that random search with early-stopping is a competitive NAS baseline that performs at least as well as ENAS, a leading NAS method, on both benchmarks. Additionally, random search with weight-sharing outperforms random search with early-stopping, achieving a state-of-the-art NAS result on PTB and a highly competitive result on CIFAR-10. Ameet concludes by exploring existing reproducibility issues for published NAS results, noting the lack of source material needed to exactly reproduce these results, and discussing the robustness of published results given the various sources of variability in NAS experimental setups."--Resource description page 
590 |a O'Reilly  |b O'Reilly Online Learning: Academic/Public Library Edition 
650 0 |a Neural networks (Computer science) 
650 0 |a Machine learning. 
650 0 |a Computer network architectures. 
650 2 |a Neural Networks, Computer 
650 6 |a Réseaux neuronaux (Informatique) 
650 6 |a Apprentissage automatique. 
650 6 |a Réseaux d'ordinateurs  |x Architectures. 
650 7 |a Computer network architectures.  |2 fast  |0 (OCoLC)fst00872277 
650 7 |a Machine learning.  |2 fast  |0 (OCoLC)fst01004795 
650 7 |a Neural networks (Computer science)  |2 fast  |0 (OCoLC)fst01036260 
711 2 |a O'Reilly Artificial Intelligence Conference  |d (2019 :  |c New York, N.Y.)  |j issuing body. 
856 4 0 |u https://learning.oreilly.com/videos/~/0636920339397/?ar  |z Texto completo (Requiere registro previo con correo institucional) 
994 |a 92  |b IZTAP