Best Python resources for 2019

After listening to Talk Python to Me 194, I started thinking about what I would recommend to people who are getting started with python. It's a complicated question, actually, because it depends on what this hypothetical neophyte intends to do, and more importantly, whether she/he is interested in getting nerdy about python or just wants to know enough to get it done. 
In any case, it doesn't really matter, because lists like this aren't really for this hypothetical person. This is a list of stuff I like, and that I recommend. So let's not get caught up in some conceit, and just get to a list of good python resources.
First, I find a lot of utility in the posts at RealPython.com. Each post is based on a python package or a feature and gives a fairly detailed walk-through of that topic. They aren't always relevant for my own uses, but I even read some of those to get a better idea of how the other half lives. Other good sites worth visiting include FullStackPython.com, Python.org, and SciPy.org. The latter two lead to the official documentation, which is sometimes (not always) useful for learning about the details. I've also got a news feed set up for python-related content (using Feedly), and much of that content actually comes through PlanetPython.org; it's more of a mixed bag, but I find and read enough through it that it is worth mentioning.
I listen to two podcasts focused on python, both hosted by Michael Kennedy: Talk Python To Me and Python Bytes. The first is an interview show, usually with a developer or someone who works in some corner of the python world. The second is a weekly python news show, and they really do a great job of finding the breaking python news and reporting on new and interesting projects. I recommend both, but Python Bytes seems nearly essential at this point. There are other python podcasts, but I haven't gotten into regularly listening to any of them. I think that's because the ones I've heard get a little too into details that I don't care about, especially about web development (a topic I know nothing about, so I get lost very quickly).
On the video front, there are a couple of content providers worth subscribing to. First are Dan Bader's videos; he now runs RealPython.com, too, and his videos are well-produced, reasonable-length tutorials, usually on some feature of standard python. I think his publishing frequency has decreased since moving to RealPython.com, but some of his videos are worth revisiting. I also highly recommend the videos by Corey Schafer. He provides more step-by-step tutorials, and does a great job of stepping through and building up lessons. I also keep an eye on Coding Tech, which often has interesting videos on programming. And I watch a lot of the recorded presentations from python conferences: PyCon, PyData, and SciPy especially, but others as well (PyVideo.org has a list). I also think the Socratica python videos are good, if a bit hokey. Finally, Enthought publishes a lot of useful videos, and seems to be responsible for the SciPy conference talks.
Finally, tools for leveling up your python skills. There are tons of resources out there, and I have not tried all of them. I have experimented, though, and I've returned to a few of them. First is ProjectEuler.net. It isn't a python site; it really isn't even a coding site, so much as an algorithm and math site. It's just a list of problems; you solve a problem and input the answer, which unlocks a discussion board about that problem where you can see other solutions and sometimes detailed "official" solutions with explanation. I still haven't done very many of the problems, but I find them more fun than other similar sites. A more sophisticated version of the same concept is CodeSignal, which used to be called CodeFights. The problems are much more computer-science focused, geared toward developers and would-be developers getting ready for interviews. The fun thing is that the site provides a full programming environment, and your code is written and run right on the site. It is well designed, and many of the problems that I've done have been fun. You can choose from many languages, so it is not a python-specific resource. CodeSignal seems to be nearly the same idea as CodeWars, which I have not tried. A new one to keep an eye on is Coder.com; this is VSCode in the browser with access to computational resources. I'm not sure where this is going, but it seems well-made and interesting.
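To give a flavor of Project Euler: Problem 1 (the first, and easiest, problem on the site) asks for the sum of all the natural numbers below 1000 that are multiples of 3 or 5. In python it's essentially a one-liner:

```python
# Project Euler, Problem 1: sum the multiples of 3 or 5 below 1000
answer = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(answer)  # 233168
```

The problems ramp up quickly from there, and the brute-force approach that works here stops being viable once the inputs get large.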
Okay, finally, finally is the omnipresent internet resource for coding things: stackoverflow. I don't really like SO in many ways. The main ones are (a) I don't think people do a good job of asking their questions, and (b) people get pedantic (or just jerky) about answering questions. That said, a lot of the answers are very useful, and the best ones provide great guidance and insight about how things work.


In-place upgrade from python 3.6 to 3.7

Based on reports on Python Bytes and from [here] and [there], it seems like 3.7 is generally faster than 3.6. So, I decided to try it. On one machine, I set up a fresh conda environment with 3.7 and installed all the packages I typically use. The first time I did that, which was months ago, not everything was working, and I put this upgrade plan on hold. Later, I re-tested, and all my packages seemed to be playing nicely with 3.7. I worked in that environment for a while with no problems.

During down time today, I thought it might be good to move another machine to 3.7. This time I decided to take the leap and move my base environment to python 3.7 from 3.6.6. Why not?

There is one step:
$ conda install python=3.7

This takes some time while conda "solves" the environment. I'm not sure exactly what this does, but since it checks dependencies, it's no wonder it takes a while: essentially every installed package will need to be removed and reinstalled.

One potential gotcha with this approach is that anything that was pip installed will need to be reinstalled. I think there are a couple of these, but I don't know how to tell which are which. Oh well, I guess I'll find out when something breaks. 
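Rather than waiting for something to break, a quick smoke test that tries importing everything I care about would find problems sooner. A minimal sketch (the package list here is just mine; adjust to taste):

```python
import importlib

def check_imports(names):
    """Try importing each package; report its version or the import failure."""
    results = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            results[name] = getattr(mod, "__version__", "unknown")
        except ImportError as exc:
            results[name] = f"FAILED: {exc}"
    return results

# The packages I'd want working after the upgrade
for pkg, status in check_imports(["numpy", "pandas", "matplotlib", "xarray"]).items():
    print(f"{pkg}: {status}")
```

Anything pip-installed that got clobbered by the reinstall should show up here as a failed import.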

Eventually the environment does get solved, and a plan is constructed. Answer 'y' and conda dutifully downloads and extracts many packages.

Conda does all the work:
Preparing transaction step happens with the spinning slash, and finishes.
Verification step happens with the spinning slash, and finishes.
Removing some deprecated stuff (jupyter js widgets) ...  And then enabling notebook extensions and validating. Give the OK.
Prints done, returns the prompt.

Did it work?

$ which python

$ python --version
Python 3.7.1

$ python
Python 3.7.1 | packaged by conda-forge | (default, Nov 13 2018, 09:50:42)
[Clang 9.0.0 (clang-900.0.37)] :: Anaconda custom (64-bit) on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> print(2.**8)
256.0
>>> import numpy as np
>>> import pandas as pd
>>> import matplotlib as mpl
>>> import xarray as xr
>>> xr.DataArray(np.random.randn(2, 3))
<xarray.DataArray (dim_0: 2, dim_1: 3)>
array([[-0.355778,  0.836539,  0.210377],
       [ 0.480935,  0.469618, -0.101545]])
Dimensions without coordinates: dim_0, dim_1
>>> data = xr.DataArray(np.random.randn(2, 3), coords={'x': ['a', 'b']}, dims=('x', 'y'))
>>> xr.DataArray(pd.Series(range(3), index=list('abc'), name='foo'))
<xarray.DataArray 'foo' (dim_0: 3)>
array([0, 1, 2])
Coordinates:
  * dim_0    (dim_0) object 'a' 'b' 'c'

Okay, this seems to be working. Repeated similar interactive test with ipython. So far, so good.

Lesson: conda is kind of amazing.


On Microsoft acquiring GitHub

In the news today is that Microsoft is acquiring GitHub (Pocket Link to Verge). That's a big deal for a lot of big (and small) open source projects. It's definitely going to rub a lot of open-source developers the wrong way, as many are motivated to contribute to open source projects as a direct response to decades of difficulties with Microsoft.

I am not a Microsoft user, nor a fan. But I do acknowledge that there have been some apparent improvements over the past few years under CEO Satya Nadella. They have some notable open projects, especially VS Code, which is becoming very popular. From what I hear, Windows is still a monstrosity that should not be used, but it is clear that Windows 10 is improved over the last few versions (granted, the last time I really used Windows was Windows 98, I think).

My suspicion is that Microsoft wants GitHub because they want to use it internally for very large projects. Recently Microsoft has been working to make git more useful for humongous projects, specifically with the virtual file system (techcrunch). By controlling GitHub, Microsoft becomes the biggest player in git as well, which I'm sure won't please some.

It's also worth noting that GitHub developed the text editor Atom, which is pretty similar to their aforementioned VS Code. Atom is an open source project, but it will be interesting to see what happens to Atom development going forward.

Finally, I'll also mention that GitHub has adopted Markdown as its documentation markup language of choice. There's already a GitHub flavor of Markdown, which I think is probably the dominant version. Now that Microsoft owns GitHub, I wonder if it will impact the use of Markdown, and especially if GitHub-flavored Markdown will further evolve away from the original?


What is aerosol radiative forcing?

There is a lot of research about the climate impact of aerosols. One of the fundamental measures of the climate impact is the "radiative forcing" associated with aerosols. It's not obvious what exactly aerosol radiative forcing is, however, so here we begin our examination of this question.

The IPCC AR4 provides a nearly useless description: [LINK]

We can discern two important facets of aerosol radiative forcing from that description:
  1. It is measured based on top-of-atmosphere (TOA) radiative fluxes. 
  2. It includes the impact of aerosol on clouds. 
It is also useful to look at other parts of AR4, where better text describes aerosol effects. I started at the link above because when I search for "aerosol radiative forcing" that is one of the top hits I get. That's an unfortunate hit because the text surrounding that small section is much more informative.  

The first thing that can be clarified is that #1 above is part of the definition of radiative forcing. As far as IPCC reports go, radiative forcing is the impact that a forcing agent has on the net TOA fluxes. The concept is useful because it is derived from the basic physics of conservation of energy and thermodynamics. In equilibrium the net TOA flux is zero (averaged over a year, or many years). When a forcing agent is applied to the system, such as anthropogenic aerosol, the energetic consequence may be a change in that TOA balance (i.e., a radiative forcing), and having a TOA imbalance causes the system to respond. We deduce that if the forcing is negative the system will cool to achieve a new balance, but if the forcing is positive (i.e., more energy is entering the system than leaving) the system will warm to achieve a new balance. Aerosols typically fall into the negative forcing category, and so cause a cooling, but the story is not really so simple.
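To put that reasoning in symbols (my notation, not the IPCC's): letting N be the net downward radiative flux at TOA, the bookkeeping sketched above is:

```latex
% Net downward TOA flux: absorbed incoming minus outgoing radiation;
% its long-term average vanishes in equilibrium
N = F_{\downarrow} - F_{\uparrow}, \qquad \overline{N} \approx 0 \;\text{(equilibrium)}

% Radiative forcing of an agent: the change in net TOA flux it induces
\mathrm{RF} = \Delta N = N_{\text{with agent}} - N_{\text{without agent}}

% RF < 0: energy deficit, system cools toward a new balance
% RF > 0: energy surplus, system warms toward a new balance
```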

In particular, it is helpful to split aerosol effects into two pieces:
  1. direct effects of aerosol particles on radiative transfer through the atmosphere (scattering and absorption) (aka, aerosol-radiation interaction, ari)
  2. indirect effects of aerosol that change the radiative properties of clouds, or change the lifetime of clouds (aka, aerosol-cloud interaction, aci)
The IPCC AR5 [LINK] includes a lot of treatment of aerosol radiative forcing. Since it's newer, perhaps we should focus there for some clarity on this issue. An important distinction is drawn in AR5 between radiative forcing (RF) and effective radiative forcing (ERF). RF is just what we were describing, namely the change in the net TOA flux (allowing adjustment of the stratosphere), while ERF also allows the troposphere to adjust to the forcing agent. Establishing ERF is tricky because it does not allow the global average surface temperature to adjust; the idea is that ERF includes tropospheric "rapid adjustments," while RF only allows the rapid stratospheric adjustment to occur. Confused yet?

We will return to this distinction in another post. For now, we need to consider that both direct and indirect effects have an RF but also an ERF. This complicates the picture because it further muddies the water with respect to how we describe how aerosols affect the climate system. Mostly AR5 seems to deal with RF for direct aerosol effects and ERF for indirect effects. For now, though, let's return to our basic question of what aerosol radiative forcing is.

Based on IPCC AR4 and AR5, along with a lot of literature reviewed therein, and also my own literature review that spans from the 1980s to today, the easiest way to express the meaning of aerosol radiative forcing is:
Aerosol radiative forcing is the change in TOA radiative fluxes between the preindustrial period and the present day. The aerosol radiative forcing can be divided into direct effects, in which aerosol affects radiative transfer, and indirect effects, in which aerosol interacts with clouds.
The estimate of the total direct aerosol radiative forcing is around -0.35 (-0.85 to +0.15) W m-2. Including indirect effects switches to using the ERF concept, which we will examine in another post, but the AR5 bottom line is that the total aerosol effect is a negative forcing of about -1 W m-2, though that is basically plus or minus 1 W m-2.

What I want to point out before closing is that I described RF in the beginning as fundamental, but the definition that I've just provided seems far from fundamental. When we use this definition of aerosol radiative forcing, we need to define what pre-industrial means and what present day means. We know intuitively what both are supposed to mean, but quantitatively this is ambiguous. Particularly troublesome is that we do not have adequate observations from pre-industrial times to really know what the aerosol concentrations or emissions were. This provides an irreducible uncertainty for aerosol radiative forcing using this definition.  We will revisit some of these concepts in future posts, and we will return to the difficulties associated with this definition of aerosol radiative forcing.


Joseph Romm raves about Reagan, balks at Barack: Figures of speech make and break communication

I have recently read Joseph Romm's new book, Language Intelligence,
which is really a brief review of rhetoric. It introduces modern readers to
the age-old topic of eloquent language intended to persuade
audiences. Romm uses just a few prime examples for each of the several
topics covered, from the ancient Greek greats to medieval masters who
wrote the King James Bible to modern practitioners such as Lady
Gaga. The point is to expose the principles of rhetorical discourse,
such as the various forms of repetition, irony, metaphor, and
seduction, and provide readers with some of the tools necessary to
build an effective argument as well as to erect a wall to defend
against the constant bombardment by advertisers, politicians, and
other persuaders.

The lessons are clear and well illustrated by examples. Especially
useful are the examples from recent political figures such as both
George Bushes, Bill Clinton, Barack Obama, and Mitt Romney. Several
Republican strategists are pointed out for their cunning use of
rhetorical devices (Luntz and Rove, especially). Scientists (climate
scientists, especially) are singled out for their clumsy attempts to
communicate, usually avoiding rhetorical figures of speech. The
use of the figures being discussed occasionally becomes too blatant,
often in the final paragraphs of sections, but it is pleasing as a
reader to see such employment as sections close because it reinforces the
lesson. I am convinced that this brief introduction should be standard
reading for college students across disciplines, and those in the
sciences should pay careful attention to the lessons and employ more
intelligent language when describing their own work. Older readers
might pick up some new tricks, too, if they choose to read the book.


American Meteorological Society Statement on Climate Change

Posting two days in a row!?!?

I just wanted to draw attention to the updated statement on climate change from the American Meteorological Society. Here's the link: [LINK]. It is just a 7-page statement that goes through the following sections:

  • Background
  • How is climate changing?
  • Why is climate changing?
  • How can climate change be projected into the future?
  • How is the climate expected to change in the future?
  • Final remarks
There is nothing surprising in the statement. The AMS supports the scientific consensus that the Earth's surface and lower atmosphere are warming due to the accumulation of greenhouse gases in the atmosphere from combustion of fossil fuels and deforestation. Overall, it is well written and straightforward, and I recommend taking a look at it no matter what your background is. My guess is that everyone will get the gist, and if you've got any background in climate science then you'll pick up on some of the details. I'd quibble over some of the word choices here and there, but the substance is fine. Maybe they overemphasize climate models in the future sections, since many of the points they make there are based not solely on model projections, but also on observations and basic theory. Anyway, go take a look.


Smart meters and dumb people

As a regular listener of Coast to Coast AM, I have been aware of a conspiracy theory involving the transition from old-timey analog utility meters to internet-connected smart meters. Smart meters allow 2-way communication between a house's utility meter and the utility company. The idea is to monitor electricity use in real time (or near real time), which allows more nimble management of the electric grid: getting electricity where it is needed when it is needed, and managing electricity generation more efficiently. Both proponents and opponents cite the potential for tiered pricing, such as raising prices during peak energy use times. While some say this will help incentivize energy conservation, others say tiered pricing could hurt lower-income households disproportionately.

The conspiracists, however, are not worried about low-income households or energy conservation. There are really two flavors of the smart meter conspiracy. First is an irrational fear of technology that manifests as a concern about the radiation from smart meters being a health hazard. Yes, really [example]. This is not a legitimate concern, as the radiation levels are even below those of cell phones, and it is unlikely that many residents will spend significant time with their heads against their electricity meter. The second version of the conspiracy is rooted in a deep distrust of government and an overly aggressive view of privacy. These are the people, like the ones cited in this Grist post [LINK] and the accompanying AP news article [LINK] about the smart meter opposition in Texas, who believe that the smart meters are ... well, let's just boil it down: they think the smart meters allow the government to spy on them at home [great example, go ahead and browse this crazy site, I'll wait]. There might be some actual privacy issues with smart meters (which that example kind of hits, before veering back to crazy), such as the potential for utilities to synthesize usage data and sell it to interested parties (who want to target their marketing efforts). This probably isn't much of a concern at this point, as it is unclear that utilities are savvy enough to profitably undertake such an analysis. Really, this comes down to some far-right-wing ideas that get mixed up by fear mongers into ridiculous conspiracy theories, encapsulated by this quote from the above-cited blog: "This is all part of the radical green agenda that is being forced down the throats of people all over the world."

Maybe I should just list a couple of points that I think are relevant (in no particular order):

  1. Energy conservation is a good idea, and represents one of the "stabilization wedges" that we talk about as currently available solutions to global warming. [LINK]
  2. The radiation associated with electronics is not harmful.  [LINK]
  3. One of the criticisms that the opponents of smart meters seem to bring up often is that the FCC does not have strong enough restrictions on radiation [example]. Yet, as pointed out in Grist and the AP story above, these people tend to be on the far-right/libertarian/tea-party fringe of the political spectrum, meaning that philosophically they are opposed to government regulation (in favor of letting the "market" work out the appropriate solutions). This is completely inconsistent. I don't think this is an argument against the smart meter opposition, just a point that I wish would be discussed.
  4. The possibility of utilities selling the information aside, there seems to be a general fear of a degradation of privacy with smart meters, but I don't think there is any evidence that any personal information could be or is being collected by these meters.
  5. The transition to smart meters is being driven by the "market" as utilities try to reduce costs and maximize efficiency. This is a direct descendant of the deregulation of utilities in the USA, which right-leaning folks should be applauding (if they were being consistent with their purported economic philosophy).
  6. Because the utilities are deregulated, this information would be flowing to the utilities, and not the government. That means that there must be an extra layer of conspiracy in order to bring the government into the picture. Each additional layer of conspiracy makes the theory less and less plausible.
I think I can leave it there.

Note that in the links above that are cited as examples, I have used the "rel=nofollow" attribute, which tells search engines not to follow the links and so keeps them from improving those sites' search rankings. I decided to do that because those sites, while entertaining, do not present a useful view of the issue. The other links are normal, and represent appropriate source material.
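For anyone curious, this is what a nofollow link looks like in the HTML (the URL here is just a placeholder):

```html
<!-- rel="nofollow" asks search engines not to count this link
     toward the target page's ranking -->
<a href="https://www.example.com/" rel="nofollow">an entertaining but crazy site</a>
```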