## Xian's Og

### Statistics first slides

**T**oday I started my new course of Statistics for our third year undergraduates. In English! A point that came as a surprise to the students, but I have had no complaint (so far) and they started asking questions in English during the class. The slides are “under construction” and this first chapter borrows a fair chunk from Andrew’s blog entries. Including the last slide on the six Kaiser Fung quotes, which was posted last night. The next chapter is going to be more standard, with statistical models, limit theorems, and exponential families.

Filed under: Books, Kids, Statistics, University life

### Series B reaches 5.721 impact factor!

**I** received this email from Wiley with the great news that JRSS Series B has now reached a 5.721 impact factor. Which makes it the top journal in Statistics by this measure. Congrats to editors Gareth Roberts, Piotr Fryzlewicz and Ingrid Van Keilegom for this achievement! An amazing jump from the 2009 figure of 2.84…!

Filed under: Books, Statistics, University life Tagged: impact factor, John Wiley, JRSSB, Series B

### xkcd [interview & book]

**O**f interest to xkcd fans: What If?: Serious Scientific Answers to Absurd Hypothetical Questions is out! Actually, it is currently the #1 bestseller on Amazon! (A physics book makes it to the top of the bestseller list, a few weeks after a theoretical economics book got there. Nice! Actually, a statistics book also made it to the top: Nate Silver’s The Signal and the Noise…) I have not read the book, but it is made of some of the questions answered by Randall Munroe (the father of xkcd) on his what if blog. In connection with this publication, Randall Munroe is interviewed on FiveThirtyEight (Nate Silver’s website), as kindly pointed out to me by Bill Jefferys. The main message is about giving people a feeling for numbers, a rough sense of numeracy. Which was also the purpose of the guesstimation books.

Filed under: Books, Kids, Statistics Tagged: Amazon, bestseller, book review, FiveThirtyEight, Guesstimation, Nate Silver, what if?, xkcd

### available dark [book review]

*“Paved roads had long ago surrendered to gravel tracks that disappeared into a desert of snow-covered lava. Black spires like a forest of charred trees blotted out the stars near the horizon.”*

**T**his is the last book I read from my Amazon package: *Available Dark* by Elizabeth Hand. I cannot remember how I came to order it… Maybe a confusion with another fantasy author like Elizabeth Moon? Or simply because the story takes place between Maine, Finland and Iceland?! Anyway, I read the book within two days during a short hiking trip to the volcano region of Central France. The plot indeed has a mesmerizing quality that kept me reading further and further at ungodly hours. (With the help of a US jetlag.) It is original and intense enough to overcome the major difficulty that the central character, Cas, is far from sympathetic, from specialising in corpse photography to being almost constantly on drugs. But the construction of the plot and the introduction of the characters, always seen from Cas’ viewpoint, are well done, even though the ending is both rushed and unrealistic. Too many coincidences. The original setup of this novel is the Finnish black metal scene, with its undercurrents of satanism, ritual murders, and church burnings. Rather accurate judging from the wikipedia page on the topic! What I appreciated most was the description of Cas’ first impression of Iceland, when she landed from Helsinki. *“The trip to Reykjavik [from the airport] was like a bus tour through Mordor. Black lava fields, an endless waste broken here and there by ruined machinery or a building of stained corrugated metal.”* So I may consider reading another novel in the series in the near future…

Filed under: Books, Mountains, Travel Tagged: Available Dark, black metal, Elizabeth Hand, Finland, Helsinki, Iceland, Reykjavik

### 3,000 posts and 1,000,000 views so far…

**A**s the ‘Og went over its [first] million views and 3,000 posts since its start in October 2008, here are the most popular entries (lots of book reviews, too many obituaries, and several guest posts):

and the most frequent search terms (excluding those connected with my name), with again two beach towns at the top!

* benidorm (1,804)
* surfers paradise (1,050)
* george casella (785)
* mont blanc (705)
* introducing monte carlo methods with r (587)
* marie curie (500)
* mistborn (480)
* millenium (413)
* i love r (411)
* andrew wyeth (398)
* abele blanc (385)
* bayesian p value (375)
* bayesian p-value (374)
* walter bonatti (351)
* nested sampling (333)
* particle mcmc (332)
* dumplings (298)

Filed under: Books, Kids, Statistics Tagged: book reviews, guest post

### my life as a mixture [BAYSM 2014, Wien]

**N**ext week I am giving a talk at BAYSM in Vienna. BAYSM is the Bayesian *Young* Statisticians meeting, so one may wonder why, but Chris Holmes, Mike West, and I got invited as more… erm… senior speakers! So I decided to give a definitely *senior* talk on a thread pursued throughout my career so far, namely mixtures. Plus, it also relates to works of the other senior speakers. Here is the abstract for the talk:

Mixtures of distributions are fascinating objects for statisticians in that they both constitute a straightforward extension of standard distributions and offer a complex benchmark for evaluating statistical procedures, with a likelihood both computable in linear time and enjoying an exponential number of local modes (and sometimes an infinite number of modes). This fruitful playground appeals in particular to Bayesians as it constitutes an easily understood challenge to the use of improper priors and of objective Bayes solutions. This talk will review some ancient and some more recent works of mine on mixtures of distributions, from the 1990 Gibbs sampler to the 2000 label switching and to later studies of Bayes factor approximations, nested sampling performances, improper priors, improved importance samplers, ABC, and an inverse perspective on the Bayesian approach to testing of hypotheses.
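As a toy illustration of two points in the abstract, a likelihood computable in linear time despite its exponential number of expanded terms, and the exact label-switching symmetry, here is a small sketch (my own illustration, not part of the talk):

```python
import numpy as np

def mixture_loglik(mu1, mu2, x, w=0.5, sd=1.0):
    """Log-likelihood of a two-component Gaussian mixture.

    Cost is linear in the sample size, even though expanding the
    product of sums would give 2^n terms."""
    d1 = np.exp(-0.5 * ((x - mu1) / sd) ** 2)
    d2 = np.exp(-0.5 * ((x - mu2) / sd) ** 2)
    return (np.sum(np.log(w * d1 + (1 - w) * d2))
            - len(x) * np.log(sd * np.sqrt(2 * np.pi)))

rng = np.random.default_rng(42)
# data from a mixture with means -2 and 2
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])

# the likelihood is invariant under swapping the component labels,
# hence (at least) two symmetric modes:
l1 = mixture_loglik(-2.0, 2.0, x)
l2 = mixture_loglik(2.0, -2.0, x)  # label-switched: same value
```

With equal weights the two labelings give exactly the same likelihood, which is the root of the label-switching issue mentioned in the abstract.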

**I** am very grateful to the scientific committee for this invitation, as it will give me the opportunity to meet the new generation, learn from them, and in addition discover Vienna, where I have never been despite several visits to Austria. Including its top, the Großglockner. I will also give a seminar at the Institut für Angewandte Statistik in Linz the day before.

Filed under: Books, Kids, Mountains, pictures, Statistics, Travel, University life Tagged: Austria, Bayes factor, Bayesian tests of hypotheses, BAYSM, Gibbs sampling, importance sampling, label switching, Linz, mixtures, Vienna, WU Wirtschaftsuniversität Wien

### O’Bayes 2015: back in València

**T**he next O’Bayes meeting (more precisely the International Workshop on Objective Bayes Methodology, O-Bayes15), will take place in València, Spain, on June 1-4, 2015. This is the second time an O’Bayes conference takes place in València, after the one José Miguel Bernardo organised in 1998 there. The principal objectives of O-Bayes15 will be to facilitate the exchange of recent research developments in objective Bayes theory, methodology and applications, and related topics (like limited information Bayesian statistics), to provide opportunities for new researchers, and to establish new collaborations and partnerships. Most importantly, O-Bayes15 will be dedicated to our friend Susie Bayarri, to celebrate her life and contributions to Bayesian Statistics. Check the webpage of O-Bayes15 for the program (under construction) and the practical details. Looking forward to the meeting and hopeful for a broadening of the basis of the O’Bayes community and of its scope!

Filed under: pictures, Statistics, Travel, University life Tagged: José Miguel Bernardo, O-Bayes 2015, objective Bayes, Spain, Susie Bayarri, Valencia conferences

### Scottish polls…

**A**s much as I love Scotland, or because of it, I would not dream of suggesting to Scots that one side of the referendum sounds better than the other. However, I am rather annoyed at the yoyo-like reactions to the successive polls about the result, because, just like during the US elections, each poll is analysed separately rather than being pooled with the earlier ones in a reasonable meta-analysis… Where is Nate Silver when we need him?!
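For illustration, here is a minimal inverse-variance pooling of poll shares, with made-up numbers, the crudest possible fixed-effect meta-analysis, ignoring house effects and time trends:

```python
import math

def pool_polls(polls):
    """Inverse-variance (fixed-effect) pooling of poll proportions.

    polls: list of (yes_share, sample_size) pairs.
    Returns the pooled share and its standard error."""
    weights, weighted = [], []
    for p, n in polls:
        var = p * (1 - p) / n          # binomial variance of the share
        weights.append(1 / var)
        weighted.append(p / var)
    pooled = sum(weighted) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, se

# illustrative (made-up) referendum polls: (yes share, sample size)
pooled, se = pool_polls([(0.47, 1000), (0.51, 800), (0.49, 1200)])
```

The pooled estimate has a much smaller standard error than any single poll, which is precisely why reacting to each poll in isolation is a mistake.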

Filed under: pictures, Statistics, Travel Tagged: elections, Glasgow, Hillhead, independence, Nate Silver, poll, Scotland, Scottish independence referendum, United Kingdom

### random generators… unfit for ESP testing?!

*“The term *psi* denotes anomalous processes of information or energy transfer that are currently unexplained in terms of known physical or biological mechanisms.”*

**W**hen re-reading [in the taxi to Birmingham airport] Bem’s piece on “significant” ESP tests, I came upon the following hilarious part that I could not let pass:

*“For most psychological experiments, a random number table or the random function built into most programming languages provides an adequate tool for randomly assigning participants to conditions or sequencing stimulus presentations. For both methodological and conceptual reasons, however, psi researchers have paid much closer attention to issues of randomization. *

*At the methodological level, the problem is that the random functions included in most computer languages are not very good in that they fail one or more of the mathematical tests used to assess the randomness of a sequence of numbers (L’Ecuyer, 2001), such as Marsaglia’s rigorous Diehard Battery of Tests of Randomness (1995). Such random functions are sometimes called pseudo random number generators (PRNGs) because they [are] not random in the sense of being indeterminate because once the initial starting number (the seed) is set, all future numbers in the sequence are fully determined.”*

**W**ell, the pseudo-random generators included in all modern computer languages that I know of have passed batteries of tests like Diehard. It would be immensely useful to learn of counterexamples, as users of the corresponding languages should be warned!!!
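To illustrate the determinism the quote complains about, which is precisely what makes a study reproducible (Python’s default generator is the Mersenne Twister, which passes such test batteries):

```python
import random

# A PRNG is fully determined by its seed: re-seeding reproduces
# the exact same stream.
random.seed(12345)
stream1 = [random.random() for _ in range(5)]

random.seed(12345)
stream2 = [random.random() for _ in range(5)]
# the two streams are identical, element for element
```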

*“In contrast, a hardware-based or “true” RNG is based on a physical process, such as radioactive decay or diode noise, and the sequence of numbers is indeterminate in the quantum mechanical sense. This does not in itself guarantee that the resulting sequence of numbers can pass all the mathematical tests of randomness (…) Both Marsaglia’s own PRNG algorithm and the “true” hardware-based Araneus Alea I RNG used in our experiments pass all his diehard tests (…) At the conceptual level, the choice of a PRNG or a hardware-based RNG bears on the interpretation of positive findings. In the present context, it bears on my claim that the experiments reported in this article provide evidence for precognition or retroactive influence.”*

**T**here is no [probabilistic] validity in the claim that hardware random generators are more random than pseudo-random ones. Hardware generators may be unpredictable even by their designer, but the only way to check that they produce draws from a uniform distribution follows exactly the same pattern as for PRNGs. And the lack of reproducibility of their output makes it impossible to check the reproducibility of the study. But here comes the best part of the story!
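To stress that the empirical check is the same whatever the source, here is a minimal chi-square uniformity test that applies verbatim to a PRNG stream, a hardware device, or a file of recorded outputs (a toy stand-in for the much stronger Diehard batteries):

```python
import random

def chi_square_uniformity(draws, bins=10):
    """Chi-square statistic for uniformity of draws in [0,1).

    The test is identical whatever produced the numbers: a seeded
    PRNG, a hardware device, or a recorded sequence."""
    counts = [0] * bins
    for u in draws:
        counts[min(int(u * bins), bins - 1)] += 1
    expected = len(draws) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(1)
stat = chi_square_uniformity([random.random() for _ in range(10000)])
# with 9 degrees of freedom, values far above ~21.7 would be suspicious
```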

*“If an algorithm-based PRNG is used to determine the successive left-right positions of the target pictures, then the computer already “knows” the upcoming random number before the participant makes his or her response; in fact, once the initial seed number is generated, the computer implicitly knows the entire sequence of left/right positions. As a result, this information is potentially available to the participant through real-time clairvoyance, permitting us to reject the more extraordinary claim that the direction of the causal arrow has actually been reversed.”*

**E**xtraordinary indeed… But not more extraordinary than conceiving that a [psychic] participant in the experiment may “see” the whole sequence of random numbers!

*“In contrast, if a true hardware-based RNG is used to determine the left/right positions, the next number in the sequence is indeterminate until it is actually generated by the quantum physical process embedded in the RNG, thereby ruling out the clairvoyance alternative. This argues for using a true RNG to demonstrate precognition or retroactive influence. But alas, the use of a true RNG opens the door to the psychokinesis interpretation: The participant might be influencing the placement of the upcoming target rather than perceiving it, a possibility supported by a body of empirical evidence testing psychokinesis with true RNGs (Radin, 2006, pp.154–160).” *

**G**ood! I was just about to make the very same objection! If someone can predict the whole sequence of [extremely long integer] values of a PRNG, it is hardly any more irrational to imagine that he or she can mentally impact a quantum mechanical event. (And hopefully save Schrödinger’s cat in the process.) Obviously, this begs the question as to how a subject could forecast the location of a picture that depends on the random generation, but not the result of the random generation itself.

*“Like the clairvoyance interpretation, the psychokinesis interpretation also permits us to reject the claim that the direction of the causal arrow has been reversed. Ironically, the psychokinesis alternative can be ruled out by using a PRNG, which is immune to psychokinesis because the sequence of numbers is fully determined and can even be checked after the fact to confirm that its algorithm has not been perturbed. Over the course of our research program—and within the experiment just reported—we have obtained positive results using both PRNGs and a true RNG, arguably leaving precognition/reversed causality the only nonartifactual interpretation that can account for all the positive results.”*

**T**his is getting rather confusing: avoid using a PRNG for fear the subject infers the sequence, and avoid using a true RNG for fear of the subject tampering with the physical generator. An omniscient psychic would be able to handle both types of generators, wouldn’t he or she?!

*“This still leaves open the artifactual alternative that the output from the RNG is producing inadequately randomized sequences containing patterns that fortuitously match participants’ response biases.”*

**T**his objection shows how little confidence the author has in the randomness tests he previously mentioned: a proper random generator is not *inadequately randomized*. And if chance alone rather than psychic powers is involved, there is no explanation for the match with the participants’ response biases. Unless those participants are so clever as to detect the flaws in the generator…

*“In the present experiment, this possibility is ruled out by the twin findings that erotic targets were detected significantly more frequently than randomly interspersed nonerotic targets and that the nonerotic targets themselves were not detected significantly more frequently than chance. Nevertheless, for some of the other experiments reported in this article, it would be useful to have more general assurance that there are not patterns in the left/right placements of the targets that might correlate with response biases of participants. For this purpose, Lise Wallach, Professor of Psychology at Duke University, suggested that I run a virtual control experiment using random inputs in place of human participants.”*

**A**bsolutely brilliant! This test replacing the participants with random generators has shown that the subjects’ answers do not correspond to an iid sequence from a uniform distribution. It would indeed require great psychic powers to reproduce a perfectly iid U(0,1) sequence! And the participants were warned about the experiment so naturally expected to see patterns in the sequence of placements.

Filed under: Books, Statistics Tagged: Birmingham, DieHard, ESP, hardware random generator, PRNG, pseudo-random generator, random simulation, randomness

### ABC@NIPS: call for papers

*In connection with the previous announcement of ABC in Montréal, a call for papers that came out today:*

NIPS 2014 Workshop: ABC in Montreal

**December 12, 2014**

** Montréal, Québec, Canada**

Approximate Bayesian computation (ABC) or likelihood-free (LF) methods have developed mostly beyond the radar of the machine learning community, but are important tools for a large segment of the scientific community. This is particularly true for systems and population biology, computational psychology, computational chemistry, etc. Recent work has both applied machine learning models and algorithms to general ABC inference (NN, forests, GPs) and ABC inference to machine learning (e.g. using computer graphics to solve computer vision using ABC). In general, however, there is significant room for collaboration between the two communities.

The workshop will consist of invited and contributed talks, poster spotlights, and a poster session. Rather than a panel discussion we will encourage open discussion between the speakers and the audience!

Examples of topics of interest in the workshop include (but are not limited to):

* Applications of ABC to machine learning, e.g., computer vision, inverse problems

* ABC in Systems Biology, Computational Science, etc

* ABC Reinforcement Learning

* Machine learning simulator models, e.g., NN models of simulation responses, GPs etc.

* Selection of sufficient statistics

* Online and post-hoc error

* ABC with very expensive simulations and acceleration methods (surrogate modeling, choice of design/simulation points)

* ABC with probabilistic programming

* Posterior evaluation of scientific problems/interaction with scientists

* Post-computational error assessment

* Impact on resulting ABC inference

* ABC for model selection


**Submission:**


We invite submissions in NIPS 2014 format with a maximum of 4 pages, excluding references. Anonymity is not required. Relevant works that have been recently published or presented elsewhere are allowed, provided that previous publications are explicitly acknowledged. Please submit papers in PDF format to abcinmontreal@gmail.com .


**ISBA@NIPS**


This workshop has been endorsed by ISBA. As part of their sponsorship, ISBA will be awarding a limited number of travel awards to PhD students and young researchers. The organizing committee may nominate particularly strong submissions for this award.

In addition to the general ISBA endorsement, ABC in Montréal has been endorsed by the BayesComp section of ISBA.


**Important Dates:**


Submission Deadline: October 9, 2014

Author Notification: October 26, 2014

Workshop: December 12 or 13, 2014


**Invited Speakers:**


Michael Blum, Laboratoire TIMC-IMAG, Grenoble

Juliane Liepe, Imperial College London

Vikash Mansinghka, MIT

Frank Wood, Oxford


**Organizers:**


Neil Lawrence, University of Sheffield

Ted Meeds, University of Amsterdam

Christian Robert, Université Paris-Dauphine

Max Welling, University of Amsterdam

Richard Wilkinson, University of Nottingham

**Contact:**

The organizers can be contacted at abcinmontreal@gmail.com.

Filed under: Statistics, Travel, University life Tagged: ABC, BayesComp, Canada, ISBA@NIPS, likelihood-free methods, machine learning, Montréal, NIPS 2014, Québec, simulation

### single variable transformation approach to MCMC

**I** read the newly arXived paper “On Single Variable Transformation Approach to Markov Chain Monte Carlo” by Dey and Bhattacharya on the pleasant train ride between Bristol and Coventry last weekend. The paper follows several earlier papers by the authors that I have not read in detail. The notion of a single variable transform is to add plus or minus the same random noise to all components of the current value of the Markov chain, instead of the standard d-dimensional random walk proposal of the reference Metropolis-Hastings algorithm; namely, all proposals are of the form

meaning the chain proceeds [after acceptance] along *one and only one* of the diagonals of the space. The authors’ arguments are that (a) the proposal is cheaper and (b) the acceptance rate is higher. What I find questionable in this argument is that neither point directly matters in evaluating the performances of the algorithm. For instance, a higher acceptance rate in a Metropolis-Hastings algorithm does not imply faster convergence or a smaller asymptotic variance. (This goes without mentioning the fact that the comparative Figure 1 is so variable with the dimension as to be of limited worth. Figures 1 and 2 are also found in an earlier arXived paper of the authors.) Moreover, restricting the moves to the diagonals of the Euclidean space implies a positive probability of making two successive proposals along *the same* diagonal, which is a waste of time. When considering the two-dimensional case, joining two arbitrary points using an everywhere positive density g on ε means generating two successive values from g, which is cost-wise equivalent to generating a single noise from a two-dimensional proposal. Without the intermediate step of checking the one-dimensional move along one diagonal. So much for a gain. In fine, the proposal found in this paper sums up as a one-at-a-time version of a standard random walk Metropolis-Hastings algorithm.
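For concreteness, here is my reading of the scheme as a quick sketch (the toy target and all names are mine, not the paper’s): one scalar noise is drawn and added with a random sign to each component, so every move lies along a diagonal of the space.

```python
import numpy as np

def mh_additive_transform(logtarget, x0, n_iter, scale=1.0, seed=0):
    """Metropolis-Hastings with a single-variable (diagonal) proposal:
    one scalar noise, added with a random sign to each coordinate, so
    the move is along a diagonal of R^d (my reading of the scheme).
    The increment distribution is symmetric, hence the plain MH ratio."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = [x.copy()]
    for _ in range(n_iter):
        eps = abs(rng.normal(0.0, scale))             # one scalar noise
        signs = rng.choice([-1.0, 1.0], size=x.size)  # a sign per coordinate
        y = x + signs * eps                           # same magnitude everywhere
        if np.log(rng.uniform()) < logtarget(y) - logtarget(x):
            x = y
        chain.append(x.copy())
    return np.array(chain)

# toy standard-normal target in d = 3 (illustrative only)
logtarget = lambda v: -0.5 * np.sum(v ** 2)
chain = mh_additive_transform(logtarget, np.zeros(3), 1000)
```

Note that only one scalar is simulated per iteration, which is the cost saving the authors invoke; the sketch makes it easy to check empirically that this does not translate into better mixing per se.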

Filed under: Books, Statistics, Travel Tagged: arXiv, asymptotic variance, Metropolis-Hastings, mixing speed, random walk

### independent component analysis and p-values

**L**ast morning at the neuroscience workshop, Jean-François Cardoso presented independent component analysis through a highly pedagogical and enjoyable tutorial that stressed the geometric meaning of the approach, summarised by the notion that the (ICA) decomposition

of the data X seeks both independence between the columns of S *and* non-Gaussianity. That is, getting as far away from Gaussianity as possible. The geometric bits came from looking at the Kullback-Leibler decomposition of the log-likelihood

where the expectation is computed under the true distribution P of the data X, and Qθ is the hypothesised distribution. A fine property of this decomposition is a statistical version of Pythagoras’ theorem, namely that when the family of Qθ‘s is an exponential family, the Kullback-Leibler distance decomposes into

where θ⁰ is the expected maximum likelihood estimator of θ. (We also noticed this possibility of a decomposition in our Kullback-projection variable-selection paper with Jérôme Dupuis.) The talk by Aapo Hyvärinen this morning was related to Jean-François’ in that it used ICA all the way to a three-level representation oriented towards natural vision modelling, in connection with his book and the paper on unnormalised models recently discussed on the ‘Og.
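The displayed formulas did not survive in this version of the post; for the record, here is my reconstruction of the standard forms they presumably took, to be read with the caveat that it is a reconstruction and not the original slides:

```latex
% ICA decomposition of the data matrix:
X = A\,S
% Kullback--Leibler decomposition of the expected log-likelihood,
% with P the true distribution of X and Q_\theta the model:
\mathbb{E}_P\!\left[\log q_\theta(X)\right]
  = \mathbb{E}_P\!\left[\log p(X)\right] - \mathrm{KL}(P \,\|\, Q_\theta)
% Pythagorean decomposition when \{Q_\theta\} is an exponential family,
% with \theta^0 the expected maximum likelihood value:
\mathrm{KL}(P \,\|\, Q_\theta)
  = \mathrm{KL}(P \,\|\, Q_{\theta^0}) + \mathrm{KL}(Q_{\theta^0} \,\|\, Q_\theta)
```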

**I**n the afternoon, Eric-Jan Wagenmakers [who persistently and rationally fights the (ab)use of p-values and frequently figures on Andrew's blog] gave a cautionary tutorial talk about the dangers of trusting p-values and going fishing for significance in existing studies, much in the spirit of Andrew’s blog (except for the defence of Bayes factors), arguing in favour of preregistration. The talk was full of illustrations from psychology. And included the line that ESP testing is the jester of academia, meaning that testing for whatever form of ESP should be encouraged as a way to check testing procedures: if a procedure finds a significant departure from the null in this setting, there is something wrong with it! I was then reminded that Eric-Jan was one of the authors who analysed Bem’s controversial (!) paper on the “anomalous processes of information or energy transfer that are currently unexplained in terms of known physical or biological mechanisms”… (And of the shocking talk by Jessica Utts on the same topic I attended in Australia two years ago.)

Filed under: pictures, Running, Statistics, Travel, University life Tagged: Australia, Bayes factor, computational vision, ESP, evidence, exponential family, ICA, independent component analysis, Kullback-Leibler divergence, normalising constant, p-values, Pythagorean theorem, statistical geometry, statistical significance

### stop culling the Alps bouquetins!

**I** just learned today that about 300 bouquetins have been killed in the French Alps over the past few days as a hasty and ungrounded measure against bovine brucellosis. I find it amazing that the local authorities can act with so little scientific justification and against European regulations that make the bouquetin a protected species. In comparison, the proposed culling of badgers in England went through experimental steps with some modicum of science. (Although it is supposed to resume next week despite Gareth’s recent ABC paper demonstrating culling is ineffective against bovine TB.)

Filed under: Mountains, pictures Tagged: Alps, Badger Trust, badgers, bouquetins, culling

### Half a king and less of a story…

**A**s ‘Og’s readers may have noticed, I have very much appreciated Joe Abercrombie’s novels and style so far, having read and reviewed all of his books. Hence, I was expecting something altogether different from *Half a King*, his latest novel… Compared with the books written so far, this one feels too light, too easy-going, too much of a one-shot read, too linear and too predictable, with none of the shadows, shortcomings, and other moral ambiguities crossing everyone and all in his earlier novels. And making Abercrombie such a special author. The main character Yarvi is not very enticing and the way he gets out of dramatic situations is not particularly convincing. Nor particularly on the moral high ground (not surprising, this, considering Abercrombie’s style!). But it sounds as if this remains justified as a lesser evil against a greater evil… The final stages of the story are just too impossible to believe. So this book is a real disappointment. After reading the book in a few hours in Bristol, a few miles from the author who lives in Bath, I went hunting for reactions on the Internet and found out that this was a young adult novel, which may explain the lack of depth and of moral ambiguity. I wish this had been spelled out more clearly before I bought the book! (As an aside, I wonder why Abercrombie has this fascination with maimed hands throughout his novels. From Ninefingers in the early novels to this half king with only two fingers on his right hand.)

Filed under: Books, Travel Tagged: Half a King, Joe Abercrombie, ninefinger, young adult books

### this issue of Series B

**T**he September issue of [JRSS] Series B that I received a few days ago is of particular interest to me. (And not as an ex-co-editor, since I was never involved in any of those papers!) To wit: a paper by Hani Doss and Aixin Tan on evaluating normalising constants based on MCMC output, a preliminary version of which I had seen at a previous JSM meeting; a paper by Nick Polson, James Scott and Jesse Windle on the Bayesian bridge, connected with Nick’s talk in Boston earlier this month; and yet another paper by Ariel Kleiner, Ameet Talwalkar, Purnamrita Sarkar and Michael Jordan on the bag of little bootstraps, whose presentation I heard Michael deliver a few times when he was in Paris. (Obviously, this does not imply any negative judgement on the other papers of this issue!)

For instance, Doss and Tan consider the multiple mixture estimator [my wording, the authors do not give the method a name, referring to Vardi (1985) but missing the connection with Owen and Zhou (2000)] of k ratios of normalising constants, namely

where the z’s are the normalising constants, with possibly different numbers of iterations for each Markov chain. An interesting starting point (that Hans Künsch had mentioned to me a while ago but that I had since forgotten) is that the problem was reformulated by Charlie Geyer (1994) as a quasi-likelihood estimation, where the ratios of all z’s relative to one reference density are the unknowns. This is doubly interesting, actually, because it recasts the constant estimation problem in a statistical light and thus somewhat relates to the infamous “paradox” raised by Larry Wasserman a while ago. The novelty in the paper is (a) to derive an optimal estimator of the ratios of normalising constants in the Markov case, essentially accounting for possibly different lengths of the Markov chains, and (b) to estimate the variance matrix of the ratio estimate by regeneration arguments. A favourite tool of mine, at least theoretically, as practically useful minorising conditions are hard to come by, if at all available.
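The display after “namely” was lost in this version of the post; here is my reconstruction of what I take to be the multiple mixture estimator, in the spirit of Vardi (1985) and Owen and Zhou (2000), so a hedged rendering rather than the authors’ exact notation:

```latex
% With k unnormalised densities f_j = z_j \pi_j, samples
% X_{l,i} \sim \pi_l (i = 1, \dots, n_l), and N = n_1 + \cdots + n_k,
% the ratios of the \hat z_j's solve the self-consistency
% (multiple mixture) equations
\hat z_j \;=\; \frac{1}{N} \sum_{l=1}^{k} \sum_{i=1}^{n_l}
  \frac{f_j(X_{l,i})}{\sum_{m=1}^{k} (n_m / N)\, f_m(X_{l,i}) / \hat z_m},
\qquad j = 1, \dots, k,
% identified only up to a common scale, i.e. through the ratios z_j/z_1.
```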

Filed under: Books, Statistics, Travel, University life Tagged: bag of little bootstraps, Bayesian bridge, Bayesian lasso, JRSSB, marginal likelihood, Markov chain Monte Carlo, normalising constant, Series B, simulation, untractable normalizing constant, Wasserman's paradox

### statistical challenges in neuroscience

**Y**et another workshop around! Still at Warwick, organised by Simon Barthelmé, Nicolas Chopin and Adam Johansen on the theme of statistical aspects of neuroscience. Being nearby, I attended a few lectures today, but most talks are more topical than my current interest in the matter, plus workshop fatigue starts to appear!, and hence I will keep a low attendance for the rest of the week to take advantage of my visit here to make some progress in my research and in the preparation of the teaching semester. (Maybe paradoxically, I attended a non-neuroscience talk by listening to Richard Wilkinson’s coverage of ABC methods, with an interesting stress on meta-models and the link with computer experiments.) Given that we are currently re-revising our paper with Matt Moore and Kerrie Mengersen (and now Chris Drovandi), I find it interesting to see a sort of convergence in our community towards a re-re-interpretation of ABC as producing an approximation of the distribution of the summary statistic itself, rather than of the original data, using auxiliary or indirect or pseudo-models like Gaussian processes. (Making the link with Mark Girolami’s talk this morning.)
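As a toy version of this re-interpretation (my own construction, not Richard’s talk): fit a Gaussian-process meta-model to simulated summary statistics and use it as a surrogate likelihood at the observed summary. All names, the simulator, and the observed value are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_summary(theta):
    """Pretend simulator: the summary is the mean of 50 N(theta,1) draws."""
    return rng.normal(theta, 1.0, size=50).mean()

# training design: simulate summaries over a range of parameter values
thetas = rng.uniform(-3, 3, size=100)
s = np.array([simulate_summary(t) for t in thetas])

def gp_predict(tgrid, tr_x, tr_y, ell=1.0, noise=0.02):
    """GP posterior mean/sd with a squared-exponential kernel."""
    K = np.exp(-0.5 * (tr_x[:, None] - tr_x[None, :]) ** 2 / ell**2)
    Ks = np.exp(-0.5 * (tgrid[:, None] - tr_x[None, :]) ** 2 / ell**2)
    A = K + noise * np.eye(len(tr_x))
    mean = Ks @ np.linalg.solve(A, tr_y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(A, Ks.T).T, axis=1) + noise
    return mean, np.sqrt(np.maximum(var, 1e-12))

s_obs = 1.2                          # observed summary (made up)
grid = np.linspace(-3, 3, 61)
mean, sd = gp_predict(grid, thetas, s)
# Gaussian surrogate "likelihood" of s_obs at each theta on the grid
loglik = -0.5 * ((s_obs - mean) / sd) ** 2 - np.log(sd)
theta_hat = grid[np.argmax(loglik)]  # surrogate pseudo-MLE
```

The meta-model approximates the distribution of the summary given θ, not of the data, which is exactly the shift of perspective mentioned above.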

Filed under: Books, pictures, Statistics, Travel Tagged: ABC, computer experiment model, Gaussian processes, indirect inference, neurosciences, University of Warwick, workshop

### Warwick campus

Filed under: pictures, Travel, University life Tagged: England, heron, mathematics, Statistics, summer, University of Warwick

### big data, big models, it is a big deal! [posters & talks]

**G**reat poster session yesterday night and at lunch today. Saw an ABC poster (by Dennis Prangle, following our random forest paper) and several MCMC posters (by Marco Banterle, who actually won one of the speed-meeting mini-project awards!, Michael Betancourt, Anne-Marie Lyne, and Murray Pollock), and then a rather different poster on Mondrian forests, which generalise random forests to sequential data (by Balaji Lakshminarayanan). The talks all had interesting aspects or glimpses about big data and some of the unnecessary hype about it (them?!), along with exposing the nefarious plans of Amazon to become the Earth’s only seller!, but I particularly enjoyed the astronomy afternoon, and even more particularly Steve Roberts’ sweep through astronomy machine-learning. Steve characterised variational Bayes as picking your choice of sufficient statistics, which made me wonder why there were no stronger connections between variational Bayes and ABC. He also quoted the book The Fourth Paradigm: Data-Intensive Scientific Discovery, edited by Tony Hey, as putting forward interesting notions. (A book review for the next vacations?!) And also mentioned zooniverse, a citizen science website I was not aware of, with a Bayesian analysis of the learning curve of those annotating citizens (in the case of supernova classification). Big deal, indeed!!!

Filed under: Books, Kids, pictures, Statistics, Travel, University life Tagged: ABC, Amazon, astronomy, astrostatistics, big data, conference, England, galaxies, pulsars, Statistics, supernovae, The Fourth Paradigm, The Large Synoptic Survey Telescope, University of Warwick, variational Bayes methods, workshop

### ISBA@NIPS

*[An announcement from ISBA about sponsoring young researchers at NIPS that links with my earlier post that our ABC in Montréal proposal for a workshop had been accepted and a more global feeling that we (as a society) should do more to reach towards machine-learning.]*

**T**he International Society for Bayesian Analysis (ISBA) is pleased to announce its new initiative *ISBA@NIPS*, an initiative aimed at highlighting the importance and impact of Bayesian methods in the new era of data science.

Among the first actions of this initiative, ISBA is endorsing a number of *Bayesian satellite workshops* at the Neural Information Processing Systems (NIPS) Conference, which will be held in Montréal, Québec, Canada, December 8-13, 2014.

Furthermore, a special ISBA@NIPS Travel Award will be granted to the best Bayesian invited and contributed paper(s) among all the ISBA endorsed workshops.

**ISBA endorsed workshops at NIPS**

- ABC in Montréal. This workshop will include topics on: Applications of ABC to machine learning, e.g., computer vision, other inverse problems (RL); ABC Reinforcement Learning (other inverse problems); Machine learning models of simulations, e.g., NN models of simulation responses, GPs etc.; Selection of sufficient statistics and massive dimension reduction methods; Online and post-hoc error; ABC with very expensive simulations and acceleration methods (surrogate modelling, choice of design/simulation points).
- Networks: From Graphs to Rich Data. This workshop aims to bring together a diverse and cross-disciplinary set of researchers to discuss recent advances and future directions for developing new network methods in statistics and machine learning.
- Advances in Variational Inference. This workshop aims at highlighting recent advancements in variational methods, including new methods for scalability using stochastic gradient methods, extensions to the streaming variational setting, improved local variational methods, inference in non-linear dynamical systems, principled regularisation in deep neural networks, and inference-based decision making in reinforcement learning, amongst others.
- Women in Machine Learning (WiML 2014). This is a day-long workshop that gives female faculty, research scientists, and graduate students in the machine learning community an opportunity to meet, exchange ideas and learn from each other. Under-represented minorities and undergraduates interested in machine learning research are encouraged to attend.

**ISBA@NIPS Travel Award**

The ISBA Program Council will grant two special ISBA Travel Awards to two selected young participants, one in the category of *Invited Paper* and one in the category of *Contributed Paper*. Each Travel Award will be of at most 1000 USD. Organisers of ISBA endorsed workshops at NIPS are all invited to propose candidates.

**Eligibility**

- Only participants of ISBA-endorsed Workshops at NIPS will be considered.
- The recipients should be graduate students or junior researchers (up to five years after graduation) presenting at the workshop.
- The recipients should be ISBA members at the moment of receiving the award.

**Application procedure**

The organizers of ISBA-endorsed Workshops at NIPS who wish to apply should select one or two candidates and propose them to the ISBA Program Council by no later than:

- September the 5th, 2014 (for the category of Invited Paper)
- October the 29th, 2014 (for the category of Contributed Paper)

The ISBA Program Council selects the two winners among the candidates proposed by all ISBA-endorsed Workshops. The outcome of the above procedure will be communicated to the Workshop Organisers by no later than:

- September the 9th, 2014 (for the category of Invited Paper)
- November the 7th, 2014 (for the category of Contributed Paper)

The winners will present a special ISBA@NIPS Travel Award recipient’s seminar at the workshops at NIPS.

Filed under: Statistics, Travel, University life Tagged: ABC in Montréal, Canada, graphical models, ISBA, machine learning, Montréal, NIPS 2014, Québec, travel award, variational Bayes methods

### big data, big models, it is a big deal!

Filed under: pictures, Statistics, University life Tagged: Amazon, big data, conference, England, Statistics, University of Warwick, workshop