September 27: After a fairly long and uneventful Wednesday, which I spent mostly hanging out working on bolometric light curve stuff, Thursday whizzed by in a flurry of activity.
Brief meeting with Rosanne and Alex
In the morning I briefly met with Rosanne and Alex again. It looks like Alex is finding some interesting objects in the data set Rosanne gave him, so they may have something to write follow-up proposals about. It should be easy enough to write some cross-matching code using the Python libraries I’ve been developing for the SkyMapper search, once I have a moment. We went back over the question of distant/hostless weirdo SNe once more; I need to talk to Mike Childress back at the ranch regarding what we could learn about these SNe from their locations relative to their (bright, nearby, interestingly-structured) host galaxies.
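The cross-matching itself is straightforward in principle. Here's a minimal sketch of the kind of thing I have in mind — this is illustrative, not the actual SkyMapper search code, and the function names and 2-arcsecond tolerance are my own placeholders:

```python
import numpy as np

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula; inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    dra, ddec = ra2 - ra1, dec2 - dec1
    a = np.sin(ddec / 2) ** 2 + np.cos(dec1) * np.cos(dec2) * np.sin(dra / 2) ** 2
    return np.degrees(2 * np.arcsin(np.sqrt(a)))

def crossmatch(ra1, dec1, ra2, dec2, tol_arcsec=2.0):
    """For each source in catalog 1, find the nearest catalog-2 source,
    and flag matches closer than tol_arcsec."""
    # broadcast to an (N1, N2) matrix of pairwise separations
    seps = angular_sep_deg(ra1[:, None], dec1[:, None], ra2[None, :], dec2[None, :])
    nearest = seps.argmin(axis=1)
    best_arcsec = seps[np.arange(len(ra1)), nearest] * 3600.0
    return nearest, best_arcsec < tol_arcsec, best_arcsec
```

The brute-force pairwise matrix is fine for transient candidate lists; for a full catalog one would swap in a spatial index (e.g. a k-d tree on unit vectors) instead.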
Kailash Sahu’s talk
Kailash Sahu gave the morning colloquium, on using HST to learn about the mass distribution of stellar remnants via gravitational microlensing. It was a brilliant talk about a very elegant method: if you only look at the lensing magnification, you see the lensed star brighten and fade symmetrically over time. But you usually can’t learn a whole lot about the lens, because there are four degenerate parameters you need to constrain (lens mass ML, lens transverse velocity vL, distance to lens DL, distance to lensed source DS): different combinations of these underlying parameters produce lensing events that look very similar to each other. However, the lensing event also induces astrometric variation (small changes in the apparent position of the lensed source) and “photometric parallax” (a small modulation in the lensing magnification, caused by small changes in the lensing geometry due to the Earth’s motion). Detecting these additional variations helps break the degeneracies and lets you determine ML with some precision. You need the sharp resolution of a space telescope to do this kind of work, although it might conceivably be possible with wide-field adaptive optics (ha!), such as the new MCAO system now being commissioned for Gemini-South.
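The degeneracy is easy to demonstrate numerically. In the sketch below (my own illustrative numbers, not Sahu's), two very different lenses are tuned to produce exactly the same Einstein-radius crossing time — and hence indistinguishable magnification curves — while their angular Einstein radii, which set the size of the astrometric wobble, differ by a factor of about four:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8      # speed of light, m/s
MSUN = 1.989e30  # solar mass, kg
KPC = 3.086e19   # kiloparsec, m
MAS = math.radians(1.0 / 3.6e6)  # one milliarcsecond, in radians

def theta_E_mas(M_msun, D_L_kpc, D_S_kpc):
    """Angular Einstein radius in mas: sqrt(4GM/c^2 * (D_S - D_L)/(D_L * D_S))."""
    M, DL, DS = M_msun * MSUN, D_L_kpc * KPC, D_S_kpc * KPC
    return math.sqrt(4 * G * M / C**2 * (DS - DL) / (DL * DS)) / MAS

def t_E_days(M_msun, D_L_kpc, D_S_kpc, v_kms):
    """Einstein-radius crossing time: the only timescale in the magnification curve."""
    r_E = theta_E_mas(M_msun, D_L_kpc, D_S_kpc) * MAS * D_L_kpc * KPC
    return r_E / (v_kms * 1e3) / 86400.0

# Lens A: a 5-Msun remnant at 2 kpc, moving at 200 km/s, source at 8 kpc.
t_A = t_E_days(5.0, 2.0, 8.0, 200.0)
# Lens B: a 1-Msun star at 4 kpc, with its transverse velocity tuned
# so that the photometric timescale matches lens A's exactly.
v_B = 200.0 * (theta_E_mas(1.0, 4.0, 8.0) * 4.0) / (theta_E_mas(5.0, 2.0, 8.0) * 2.0)
t_B = t_E_days(1.0, 4.0, 8.0, v_B)
# Same t_E, so the light curves are identical -- but theta_E differs by ~4x,
# which is exactly what the astrometric measurement picks up.
```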
After this introduction, Sahu went further and pointed out that since his intensive follow-up programs were triggered off ground-based lensing event detections, there had to be uncontrolled selection biases in the results. It’s a problem similar to triggering detailed follow-up of unusual supernovae based on search observations: you don’t have detailed information about the target from before the trigger, and the kinds of targets you find will necessarily be biased by your judgment about which ones are interesting enough to merit follow-up. But if you could do a “rolling search” for microlensing events, monitoring a single area evenly with HST, then discovery and follow-up would be part of the same program and the biases would be much reduced. This is the focus of a large program to use a few hundred orbits of HST time over the next couple of years to find a sample of some 40 interesting events.
My own lunch talk
After that, we had a selection of lunch talks. I don’t know where the ITC gets the money to pay for the near-infinite selection of free food at these lunches, but I’m not complaining.
We had three other talks besides my own:
- Xuening Bai (CfA) talked about the transport of angular momentum in numerical simulations of accretion disks, a critical problem throughout astrophysics. (If you can’t shed angular momentum from accreting material, it will just continue to orbit instead of actually accreting onto the central object. So in any case where you have reason to believe the material actually falls in, you have to somehow slow down its orbital motion.) Not much attention has been given to accretion disks threaded with large-scale magnetic fields, simply because simulations of them are numerically unstable and disintegrate catastrophically within a few orbits! But such disks might still exist in nature, so he outlined some methods for getting a handle on how they behave.
- Sahu gave another short talk reviewing progress in detecting planets through microlensing. This is of course related to his other work, but you need a much tighter cadence to detect planets than you need to search for black holes, as discussed in his colloquium from the morning.
- Carlos Cunha (Stanford) discussed some technical-sounding details of getting photometric redshifts from galaxies in large cosmology surveys like DES and LSST. Remember that a redshift is usually measured by looking for wavelength-shifted spectral lines of certain chemical elements in a galaxy’s spectrum. If your giant survey doesn’t have the time or resources to take a spectrum of every galaxy, you try to come up with ways of deducing the redshift from photometric observations in different filters (a much cruder spectrum). It turns out that for some purposes, like learning about dark energy, this is just not good enough: the accuracy of the derived redshifts is so poor that it can severely bias your results, and so you have to be careful about calibrating techniques like this.
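For contrast, the spectroscopic measurement that photo-z techniques try to shortcut is nearly trivial once a line is identified. A toy sketch (the observed wavelength here is my own illustrative number):

```python
# Toy spectroscopic redshift from a single identified emission line:
# lambda_observed = lambda_rest * (1 + z). The hard part for photo-z
# methods is that they never see the line itself, only broad-band fluxes.
H_ALPHA_REST_NM = 656.28  # rest wavelength of the hydrogen H-alpha line, nm

def redshift_from_line(observed_nm, rest_nm=H_ALPHA_REST_NM):
    """Redshift implied by an identified, wavelength-shifted spectral line."""
    return observed_nm / rest_nm - 1.0

# An H-alpha line observed at ~721.9 nm puts the galaxy at z ~ 0.1.
z = redshift_from_line(721.91)
```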
I went third. My delivery went well, I finished well within my time, and people seemed interested in what I had to say.
Bob Kirshner heckled me a bit about the accuracy of SNfactory spectrophotometry: while he seemed interested in what I was trying to do, he wondered whether large photometric errors might dominate the uncertainty in my ejected-mass measurements, or at least be much larger than what I was reporting.
This is a well-known, ongoing public image problem for SNfactory, because the SNfactory spectra have still not been published and so nobody has been able to compare our results to those of other groups for well-observed supernovae, thus verifying our claims to precision and/or accuracy. There isn’t much I can do about it, since the SNfactory collaboration board keeps tight control over how and when to publish data. For the time being, I just reminded Bob that we had published both spectrophotometry and imaging photometry for SN 2007if, and that they agreed well within our estimated errors, so that should provide an indicative case. As I said at last week’s LPNHE group meeting, I think the 56Ni distribution presents a much bigger potential uncertainty for what I’m doing.
I went back to Bob’s office for a longer chat with him and with a new Ph.D. student shopping for projects, so I got to hear a lot about what Bob’s group has been doing lately (most of which I knew about already because I follow them pretty closely). This is good, since I knew I was going to miss his talk later in the evening; I had to run and catch my train to New Haven.
Bob suggested that I try my mass reconstruction technique on a subset of the CfA supernova group’s data set. I’m keen to turn my code loose on literature supernovae eventually, since there are plenty of public data well-suited for what I want to do; but for now I’m sticking to the SNfactory data set. Bob also seems to think that using Si II velocity information will be helpful to constrain the density structure of each supernova. There’s useful information there, but I think that for most supernovae, interpreting Si II velocity information correctly will probably require complex numerical simulations, like the ones my RSAA colleague Stuart runs.
After some quick words with Ryan Foley and with Sahu, I gave back my key and hightailed it for the Red Line to South Station.