Category Archives: MrDrFenner

Sorting Friends and Permutations

Introduction

You may be familiar with permutations: a permutation is a rearrangement of a sequence. If you are, you probably know that a sequence of length \(n\) has \(n!\) arrangements (when the elements are all distinct; duplicates reduce the number of unique rearrangements). There are many, many uses of permutations and many specific permutations of interest.
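
As a quick, concrete aside of my own (not part of the original post), Python's standard library makes it easy to enumerate all \(n!\) rearrangements of a small sequence and confirm the count:

# A minimal illustration: enumerate every permutation of a small,
# distinct-element sequence and confirm there are n! of them.
from itertools import permutations
from math import factorial

seq = "ABC"                              # n = 3 distinct elements
perms = list(permutations(seq))

print(len(perms), factorial(len(seq)))   # 6 6 -- the count matches n!
for p in perms:
    print("".join(p))

With a repeated element (say "AAB"), permutations() still yields 3! = 6 tuples, but joining them gives only three distinct strings, which is exactly the reduction mentioned above.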

Continue reading

Some Iron Condor Heuristics

In the world of Iron Condors (an option position composed of a short, higher-strike (bear) call spread and a short, lower-strike (bull) put spread), there are two heuristics that are bandied about without much comment.  I'm generally unwilling to accept some Internet author's word for something (unless it completely gels with everything I know), so I set out to convince myself that these heuristics held.  Here are some demonstrations (I hesitate to use the word proof) that convinced me. Continue reading
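
For readers unfamiliar with the position, here is a minimal payoff-at-expiration sketch of my own (hypothetical strikes and credit; it is not one of the demonstrations from the post):

# Payoff per share at expiration for a short iron condor -- purely illustrative,
# with hypothetical strikes and net credit.
def iron_condor_payoff(price, put_long, put_short, call_short, call_long, credit):
    # Short (bull) put spread: sold the higher put strike, bought the lower one.
    put_spread = -max(put_short - price, 0.0) + max(put_long - price, 0.0)
    # Short (bear) call spread: sold the lower call strike, bought the higher one.
    call_spread = -max(price - call_short, 0.0) + max(price - call_long, 0.0)
    return put_spread + call_spread + credit

# Example: 90/95 put spread, 105/110 call spread, 1.50 net credit collected.
for price in (85, 95, 100, 105, 115):
    print(price, iron_condor_payoff(price, 90, 95, 105, 110, 1.50))

Between the short strikes the position keeps the full credit; beyond either long strike the loss is capped at the spread width minus the credit.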

Going All Meta (Part 2) – Some Python-Fu

In a previous post (a long, long time ago), I said I was going to talk about metaclasses (or at least show an abuse of them) in Python. I am going to get to that, but I want to set the stage by talking about another topic that isn't nearly as black-magic-y: decorators. When I'm teaching or training, people commonly ask about decorators because they have seen them and been confused by them – mainly because a common type of decorator is a function that takes in a function, modifies it, and returns a different function. Huh. Back to meta-ville.

Simply put, a decorator is a Python function with some special characteristics. That Python function takes a single, lonely input. The decorator can either be a function-decorator or a class-decorator. In the first case, the decorator takes a function as its input and produces a (modified) function as its output. In the second case, it takes a class as its input and produces a (modified) class as its output.
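
As a small sketch in the spirit of the timing notebook linked below (my own minimal version, not the notebook's code), a function-decorator that wraps and times its input might look like:

import functools
import time

def timed(func):
    """Function-decorator: takes a function, returns a (modified) function."""
    @functools.wraps(func)            # keep the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper

@timed                                # sugar for: slow_sum = timed(slow_sum)
def slow_sum(n):
    return sum(range(n))

slow_sum(1_000_000)

Applying @timed is just shorthand for slow_sum = timed(slow_sum), which is exactly the "takes in a function, returns a different function" behavior described above.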

The raw notebook: Decorator Fun – Function Timing (raw)

As seen by nbviewer: Decorator Fun – Function Timing (through nbviewer)

Some Iron Condor Heuristics

I'm still debating the best way to work with IPython notebooks on this blog.  However, until I come to a "final" answer (which might mean moving away from WordPress to a GitHub Pages/Pelican solution, à la Jake VdP), I'm just going to (hopefully) upload the notebooks and link to them via nbviewer.  Here goes:

The raw notebooks: Two Iron Condor Heuristics (Raw)

As seen by nbviewer: Two Iron Condor Heuristics (Through nbviewer)

Stomping da’ Moon

With about 6″ of snowfall in the past 24 hours, I had a great opportunity to do some stomping (my term for clunky snowshoeing) at da' Moon.  I really do want to write about something other than my outdoor clothing choices.  Mostly, I want to write about something else that is near and dear to me — training.  But, until then.

It turns out that medium socks (a heavier Smartwool pair), gaiters, gym pants, snow pants, a thermal shirt (a "heavy, just-over-base-layer" shirt), and my Patagonia Guide Softshell (with ski gloves, of course) were basically too heavy for mid-20s (mid-10s with windchill) and overcast.  I didn't really think I was trucking, but I did cover a fair bit of ground in 1:15 or so.

I'll close with a reminder (last mentioned on a long-dead cs.pitt.edu blog) that the reason I adore snowshoeing is that I can bushwhack just about anywhere.  The leaves are down.  The ground cover is carpeted with snow.  And, short of dense gaggles of branches, trees, or scrub (prickers being the only real possible "problem"), you can walk straight lines up, down, and across just about anything.

This is ultra-cool when you otherwise spend a lot of time on a mountain bike following pre-laid track.

Another Clothing Note

Just a quick note on winter clothing to go along with my prior post:  lower 40s and very humid/muggy/*damp*.  Riding boots, thick socks, light tights, baggy Fox shorts + chamois.  Started with medium fleece, dropped it after the first real climb.  Wore two long sleeve shirts (one light baselayer, one long sleeve downhill style jersey).  Started with skull cap plus urban helmet.  Ditched skullcap after about five minutes.  Overall, started too warm, but I didn’t want to get chilled with the dampness.  Once I warmed up though, I was quite toasty.  I also spent most of the ride pounding in my big ring.

The temperature’s a droppin’. The riding continues …

I had a great ride, mountain biking, at da' Moon this evening.  It was a brisk (cold) late November day:  I started at 4:00 and rode until 5:30.  The temperature was ~32F (measured at Courtdale via iPhone weather app).  Towards the end, the wind picked up a bit.  Sunset was at 4:45 in Kingston and I swapped my semi-brown shades for my helmet light around 5:00.

Continue reading

NYC 27-Hour Date

I had the most wonderful opportunity to explore NYC for a day with my wife, MrsDrFenner.  We certainly made the best of it.  We met at Grand Central Station (yay for meeting there and not saying good-bye) and walked to The Morgan.  The Morgan had been recommended to me by my closest undergraduate mathematics professor, who happened to teach me about Euclid, Plato, and mathematical Probability & Statistics.  When we (TheDrsFenner) visited our alma mater (Allegheny College) for a reunion weekend, we got to eat dinner with Dr. LoBello and he advised us to go to the Morgan.  It was very good advice.  MrsDrFenner said she was more in awe at The Morgan than she was at MoMA (in fairness, she didn’t wait in line for the Magritte exhibit).

I was personally in awe of some of the letters of historical and literary significance that were on display.  However, I almost fell over when I saw a copy of Byrne's Euclid, open and displaying (I think) the 7th proposition (i.e., a theorem) of Book I.  Had it been open to the 47th proposition, I would have fallen right over.  #47 is the Pythagorean theorem.  I'll try to remember to link a picture of me, beside the Byrne.  Seeing it reminded me that I'd like to take the online images for Book I and print them on a poster.  I'm not sure about sizing; I'm hoping pdfjam will make the project tolerable.  We also saw nice exhibits of Leonardo da Vinci and Edgar Allan Poe.

As we strolled out, we ducked into a coffee shop (Lucid?) for due espressi.  From there, we headed to dinner at The Cannibal.  The atmosphere was young, trendy, and communal.  Shared tables were the order of the day and it worked nicely.  There was a nice variety of beer (although there weren't too many must-haves for me — checking again, I see a Hill Farmstead on tap that I would have attacked).  We did really enjoy some beer-cocktails.  And the tandoori lamb belly (which might do better marketed as tandoori lamb ribs) was massively succulent.  I probably won't get it again.  But it was great to try once!  The watermelon-cilantro-hot pepper salad really worked well to cut through the fat and provide a clean counterpoint to the heaviness of the succulent belly.

Our dinner done, we headed to two bars.  The first, Middle Branch, had a speakeasy feel without requiring a password.  You do need to know where to look.  Good drinks and great atmosphere.  We really appreciated the standing room downstairs and the (uncrowded) seating area upstairs.  My riff on a Negroni (with muddled grapes) was definitely worthwhile (I'm a big Negroni and Negroni-template-riff fan).  MrsDrFenner needed something light to help her get past the heavy dinner:  our server read her mind and brought a cucumber-gimlet-like drink that fit the bill.  One and done:  we wanted to find some live jazz.  Which we did at Measure.  We grabbed a specialty cocktail (or two) and then transitioned to some fizzy water.  Our stomachs were in some dire need of help.

Having satisfied the requisite need to "paint the town red", we strolled back to Grand Central and hopped a train to the Financial District (where my hotel for the meeting is located).  We decided to try for some good NYC brunch in the AM.  We took a good bit of a walk to get to Prune.  The place was bustling and tiny, and the food was great.  We both couldn't refuse hollandaise (on eggs Benedict), but we were disappointed that we couldn't get Bloody Marys before noon (I guess it's a NY state liquor board thing — maybe only on Sunday?).  MrsDrFenner pointed out that the liquor board needs to offer a "clarification" that "of course, such laws don't apply to Mimosas and Bloody Marys".  Until then, do your research.

Our last main stops were Central Park, a hint of shopping (Athleta in person?!?), and a bite to eat before rolling to the Port Authority (Bus Terminal) and heading back to the Valley.  Central Park was a big win.  My first (naive) thought was:  they have rocks here!  That is, rocks big enough to make a 6-year-old delight in running up, down, over, and around them.  With hidden paths to explore everywhere.  We started in the area called The Ramble and it was a great, strolling treat.  It helped that the rain held off until we were on the subway to the PABT.

PyData NYC Nov. 2013

PyData 2013 NYC was a pretty great time.  It is always fun to meet folks as passionate about your favorite tools as you are.  There's probably too much to really mention, but I definitely want to throw together a few of my thoughts and ideas.  Without further ado …

Some of the talks I went to:

  • Travis talking about conda (and blog post and blog post).  While I'm an admitted gentoo fanboy (actually, I don't fan at all; I just use it), having a lighter-weight option for the Python ecosystem (across *nix (including OSX) and Windows) is really nice.  If I had realized a few things about conda last year (I'm not sure how far along it was at that point), I might have used it for some internal code deployment.
  • Yves talking about Performance Python (and an ipython notebook of the same; some other talk material is at his website).  Not much here was new to me — but — being reminded of the fundamentals and low-hanging fruit is always good.
  • Dan Blanchard talking about skll (and a link to the talk).  skll seems to take care of several procedural meta-steps in scikit-learn programs:  train/test/CV splits and model parameter grid searches.
  • Thomas Wiecki talking about pymc3 (most of the talk material shows up in the pymc3 docs; he also mentioned Quantopian’s zipline project and he has a few interesting git repos).
  • Peter Wang’s keynote was insightful, thought provoking, and not the typical painful keynote that has you checking email the whole time.  He mentioned a Jim Gray paper that seems worthwhile.  By reputation, everything Jim Gray did was worthwhile.  [Gray disappeared while sailing a few years back.]

A thought that I've had over the years and that I'd love to see come to (ongoing) completion is some sort of CI (continuous integration) job that grabs the main Python learning systems, builds them, and runs [some|many|most|all] of the learning algorithms on synthetic, random, and/or standard (UCI, Kaggle, etc.) datasets.  Of course, we would measure resource usage (time/memory) and error rates.  While the time performance is what would really get most people interested (and also cause the most dissent:  you weren't fair to XYZ), I'm more interested in verifying that random forest in scikit-learn and Orange give broadly similar results.  Throwing in some R and MATLAB options would give some comparison to the outside world, as well.
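
To give a flavor of the simplest possible version of that idea, here is a rough sketch of my own (one library, one learner, one synthetic dataset; a real harness would loop over libraries, algorithms, and datasets):

# A bare-bones sketch of the benchmarking idea: time one learner and record its
# error rate on a synthetic dataset.  Purely illustrative, not a real harness.
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)

start = time.perf_counter()
model.fit(X_train, y_train)
fit_time = time.perf_counter() - start

error_rate = 1.0 - model.score(X_test, y_test)
print(f"fit time: {fit_time:.2f}s, test error rate: {error_rate:.3f}")

The same fit-time/error-rate pair is what the CI job would log for every (library, algorithm, dataset) combination and then compare across runs.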

Doing these comparisons in the right way has a number of difficulties, as I discussed with Jake VanderPlas.  In just a few minutes, we were worried about data format differences (less important for numpy-based alternatives; Orange uses its own ExampleTable, which you can convert to/from numpy arrays), default and hard-coded parameters (possibly not being able to compare equivalent models), and social issues.

Gentoo (continued …)

So, I finally decided to try to track down the microphone issues.  As always, this involved descending into the rabbit hole for a little while.  My first goal was to get a (slightly) newer kernel.  I was running gentoo-sources-3.10.7 from a genkernel and I wanted to run gentoo-sources-3.11.6 with a manual configuration.  Some of my hardware (on my MSI Z77A-GD65 LGA 1155 Intel Z77 motherboard) is listed in my sound wrangling write-up.  I also mentioned my irritation at mapping between kernel configuration options (aka, navigating menuconfig), CONFIG_FOO (as listed in .config), and kernel module names (for example, from lspci -k).  Using my hardware, I’ll show a somewhat workable way to get from hardware to a menuconfig option.

Spelunking Hardware to Kernel Config Menu Options

# lspci 
00:00.0 Host bridge: Intel Corporation Xeon E3-1200 v2/3rd Gen Core processor DRAM Controller (rev 09)
00:01.0 PCI bridge: Intel Corporation Xeon E3-1200 v2/3rd Gen Core processor PCI Express Root Port (rev 09)
00:02.0 VGA compatible controller: Intel Corporation Xeon E3-1200 v2/3rd Gen Core processor Graphics Controller (rev 09)
00:14.0 USB controller: Intel Corporation 7 Series/C210 Series Chipset Family USB xHCI Host Controller (rev 04)
00:16.0 Communication controller: Intel Corporation 7 Series/C210 Series Chipset Family MEI Controller #1 (rev 04)
00:19.0 Ethernet controller: Intel Corporation 82579V Gigabit Network Connection (rev 04)
00:1a.0 USB controller: Intel Corporation 7 Series/C210 Series Chipset Family USB Enhanced Host Controller #2 (rev 04)
00:1b.0 Audio device: Intel Corporation 7 Series/C210 Series Chipset Family High Definition Audio Controller (rev 04)
00:1c.0 PCI bridge: Intel Corporation 7 Series/C210 Series Chipset Family PCI Express Root Port 1 (rev c4)
00:1c.6 PCI bridge: Intel Corporation 7 Series/C210 Series Chipset Family PCI Express Root Port 7 (rev c4)
00:1c.7 PCI bridge: Intel Corporation 7 Series/C210 Series Chipset Family PCI Express Root Port 8 (rev c4)
00:1d.0 USB controller: Intel Corporation 7 Series/C210 Series Chipset Family USB Enhanced Host Controller #1 (rev 04)
00:1f.0 ISA bridge: Intel Corporation Z77 Express Chipset LPC Controller (rev 04)
00:1f.2 SATA controller: Intel Corporation 7 Series/C210 Series Chipset Family 6-port SATA Controller [AHCI mode] (rev 04)
00:1f.3 SMBus: Intel Corporation 7 Series/C210 Series Chipset Family SMBus Controller (rev 04)
01:00.0 VGA compatible controller: NVIDIA Corporation GF114 [GeForce GTX 560 Ti] (rev a1)
01:00.1 Audio device: NVIDIA Corporation GF114 HDMI Audio Controller (rev a1)
03:00.0 IDE interface: ASMedia Technology Inc. ASM1061 SATA IDE Controller (rev 01)
04:00.0 FireWire (IEEE 1394): VIA Technologies, Inc. VT6315 Series Firewire Controller (rev 01)

You can also get the kernel modules used by the hardware with lspci -k.  For brevity, I’ll just grab the modules:

# lspci -k | grep use | awk '{ print $5; }' | sort | uniq
ahci
e1000e
ehci-pci
i915
nvidia
pcieport
snd_hda_intel
xhci_hcd

Not so awful.  Now, as I was cruising the web looking for some info, I stumbled across the following helpful command line to tie "most" module names back to CONFIG_ lines (I lost it, but I found the URL in my browsing history):  grep -R --include=Makefile '\bNAME\.o\b'.  For example:

# grep -R --include=Makefile '\be1000e\.o\b'
drivers/net/ethernet/intel/e1000e/Makefile:obj-$(CONFIG_E1000E) += e1000e.o

This one isn’t too surprising.  We see that e1000e.o is tied to CONFIG_E1000E.  In turn, we can jump into menuconfig (or xconfig) and search for CONFIG_E1000E (in menuconfig, type a forward-slash “/”; in xconfig, go to the find option).  In either case, we get a most useful piece of information:

Prompt: Intel(R) PRO/1000 PCI-Express Gigabit Ethernet support
Location:
-> Device Drivers
-> Network device support (NETDEVICES [=y])
-> Ethernet driver support (ETHERNET [=y])
-> Intel devices (NET_VENDOR_INTEL [=y])

Now, the only issue is navigating to that spot and enabling the feature as a module or as a built-in.  Not too shabby.  Hopefully, I won’t forget in six or twelve months when I think about building a kernel again.

Two other useful sources of information (note, I plugged in my once-in-a-while USB devices):

# lsusb
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 003 Device 002: ID 046d:08ad Logitech, Inc. QuickCam Communicate STX
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 003: ID 046d:c506 Logitech, Inc. MX700 Cordless Mouse Receiver
Bus 002 Device 005: ID 04a9:220e Canon, Inc. CanoScan N1240U/LiDE 30
Bus 002 Device 006: ID 091e:2459 Garmin International GPSmap 62/78 series
# aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: PCH [HDA Intel PCH], device 0: ALC898 Analog [ALC898 Analog]
 Subdevices: 1/1
 Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 1: ALC898 Digital [ALC898 Digital]
 Subdevices: 1/1
 Subdevice #0: subdevice #0

The camera uses gspca_zc3xx, which you can track, using the steps above, to this menu option:

Multimedia support -->
 [*] Cameras/video grabbers support 
 [*] Media USB Adapter
 --> [M] GSPCA based webcams
 --> [M] ZC3XX USB Camera Driver

I started with a kernel seeds 3.6.11 kernel as my base build, and I added a few things.  And removed a few things.  I made a diff, which I need to figure out how to conveniently host and attach to this post.  <<link to diff>>

NVIDIA Wrangling

So, that got me a running kernel.  Next, I realized the reason I hadn't gone to a 3.11 kernel sooner.  NVIDIA's official drivers (which I need for a dedicated GPU computing card) do not support >=3.11 kernels right now.  So, I needed to grab a patch for the nvidia drivers.  Fortunately, one was available.  Grab the tar file that is attached to that link and you can get suitable patches for several versions of the nvidia kernel drivers.  I was expecting to have to (1) hack an nvidia-drivers ebuild and (2) futz with a /usr/local/portage type solution … but, lo and behold, all I had to do was copy the patch to (I only wanted to patch one specific version of the drivers; I'm using 325.15):

/etc/portage/patches/x11-drivers/nvidia-drivers-325.15/get_num_physpages_325-331.patch

and emerge -avq nvidia-drivers.  Poof.  Almost easy.  I made some use of emerge @module-rebuild and emerge @x11-module-rebuild and had a working X11 very shortly thereafter.

Last but not least ….

If your memory is good, you might recall that the whole reason I undertook this project was to get my microphone working.  Who knows if the kernel upgrade helped, but with some tweaking I got it working tolerably in Skype.  This was pure, stubborn knob fiddling.  Here are the final fiddles.  The first image is from kmix (you can do the same in alsamixer; this was just easier for my screen grab).  I'm not sure if I mixed-and-matched with Capture 2 before.  Also, I tried different combinations of Rear Mic Boost to eliminate a good bit of static in my system, but with other fixes, this flat moderate level seemed to do fine.  One oddity:  kmix doesn't show the correct PCM setting (which you can see in alsamixer).  Yes, there is a PCM channel you can select under Settings -> Configure Channels.  But it isn't the right PCM channel.  Odd.

kmix-settings

The second image is from Skype.  I don't think I had noticed an Analog option before.  What's more, it had to be the "Alt Analog".  I don't know if I simply never tried it before or if it only became available due to driver differences in the new kernel.

skype-settings

After all that, my Skype Test Call finally worked.  Yay.  A practical, if not exciting, denouement.