Dave Thomas's Blog

August 2, 2013

Pragmatic Royalties—2013 Edition

Back in 2009, I posted a summary of the breakdown of the royalties we paid on our titles. Susannah, our managing editor, has been suggesting for a while that I should update it. I was surprised by what I found.


These pie charts show the percentage of titles that earn royalties in a particular range.






[Pie chart: 2009 royalty distribution by range]

[Pie chart: 2013 royalty distribution by range]







Below $10k, the numbers are about the same. That's not surprising—mostly they reflect the vanity titles that were around in 2009. But above the $10k mark, things are a little different.


In 2009, 70% of titles made more than $25k. By 2013, that number has grown to 74%.


In 2009, 41% of titles made over $50k. By 2013, it's 46%.


And while roughly the same percentage of titles made between $75k and $100k (12% in 2009, 13% now), there's a big increase in the $100–$200k wedge, up from 4% to 12%. The only drop at the top is the >$400k wedge, and that simply reflects that we haven't had a title as big as the Rails and Ruby books recently, while the overall number of titles has grown.


These numbers were pleasantly surprising. I know that as a business we are insulated from the plummeting fortunes of more conventional publishers. But I hadn't realized that we were even more attractive to authors now than back in 2009.


Maybe it's time for you to consider writing a book.

Published on August 02, 2013 19:13

July 18, 2012

Premature optimisation in my Rails session code?

For years and years, I've been writing


def current_user
  @current_user ||= (session[:user_id] && User.find_by_id(session[:user_id]))
end 


The idea was to look up the current user object the first time the method was called, store it in an instance variable, and then use the value in that variable for subsequent calls in the same request.


But now I'm thinking I'll just write this:


def current_user
  session[:user_id] && User.find_by_id(session[:user_id])
end 


Rails already caches query results for the duration of a request, so although the find() method will be called on every call to current_user(), the database will only be hit once. Why complicate my code with an optimization when I haven't identified this method's performance as an issue?
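

As a rough illustration of that claim (a sketch, not code from the post; it assumes a console session and an arbitrary user id, since inside a normal web request Rails enables the query cache for you):


# The query cache serves repeated identical queries without a second round trip.
ActiveRecord::Base.cache do
  User.find_by_id(1)   # runs SELECT `users`.* FROM `users` WHERE `users`.`id` = 1 LIMIT 1
  User.find_by_id(1)   # same SQL again, answered from the per-request query cache
end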


Is that reckless of me? Or is the ||= a premature optimization which I should train myself out of?


 

Published on July 18, 2012 11:21

June 8, 2012

Charity Ruby Workshop at the Scottish Ruby Conference

I love the idea of charity workshops before conferences. They give presenters an opportunity to meet folks, and attendees a chance to get quality instruction from world-class instructors. And best of all, all the proceeds go to charity.


I'll be giving a one-day Ruby workshop on June 28 in Edinburgh, just before the Scottish Ruby Conference. You don't have to be going to the conference to attend. The suggested donation is £75, but of course you're free to donate more :)


What will we cover? Well, you'll already know Ruby, and you'll want to be digging deeper into the stuff that makes it tick. We'll look at the object model and metaprogramming, and maybe domain specific languages. We'll probably look at some libraries, and reflection, and... 


Why so vague? Because for this kind of event, I'd like to make it relevant to the people who attend. I have a boatload of material, and we'll select what we look at on the day.


So, if you're in the area on Thursday, June 28, why not register? I'd love to see you there.

Published on June 08, 2012 12:32

March 26, 2012

Be careful using default_scope and order()

In a recent post, I talked about how the Rails first and limit/offset query modifiers are not specified to return any particular result—databases are free to determine the order of rows in a result set if the query doesn't contain an explicit order by clause, so the actual row(s) returned by these methods aren't deterministic.


Some folks responded (in comments, tweets, and emails) by suggesting that you can easily solve this problem by adding a default_scope call at the top of any model you use first and offset on. Leaving aside the fact that this really isn't a solution (the names first and offset both imply an ordering they don't deliver, so the fix belongs in them), the suggestion can lead to another unexpected problem.


Let's start with a simple ActiveRecord model:


class Wish < ActiveRecord::Base
end


Executing Wish.limit(10).offset(10).to_sql returns the SQL


SELECT  `wishes`.* FROM `wishes`  LIMIT 10 OFFSET 10


Noticing the lack of an order by clause, you add a default scope:


class Wish < ActiveRecord::Base
  default_scope order(:id)
end


Now when you execute the query, the SQL contains an order by clause:


SELECT  `wishes`.* FROM `wishes` ORDER BY id LIMIT 10 OFFSET 10


Cool. Problem fixed.


However, the user also wants a list of the 3 most recent wishes, so you code


Wish.order("created_at desc").limit(3)


Imagine your chagrin when you look at the SQL that gets run and see


SELECT  `wishes`.* FROM `wishes` ORDER BY id, created_at desc LIMIT 3


Because the id column is unique, it totally determines the ordering—the created_at desc part of the query has no effect.


Once you add a default_scope with an order clause to a model, all subsequent finders on that model will have that order as their primary ordering. If you want some other order, you'll need to remember to add unscoped to the chain:


Wish.unscoped.order("created_at desc").limit(3)
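

For reference, here's a reconstruction (not SQL quoted from the original post) of what the unscoped version should generate:


SELECT  `wishes`.* FROM `wishes`  ORDER BY created_at desc LIMIT 3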


And that's an accident waiting to happen.


 


 


 

Published on March 26, 2012 16:41

March 21, 2012

A subtle potential bug in most Rails applications

The ActiveRecord component in Rails offers a convenient and powerful interface between the set-oriented world of relational databases and the object-oriented world of Ruby programs. However, there's a potential bug lurking in many (if not most) Rails applications due to a subtle implication of the fact that sets, and hence database result sets, are not ordered.


Take a simple ActiveRecord call such as Post.first. Ask Rails developers what this does, and most will say that it returns the first row from the posts table. And, most of the time, for small to medium-sized tables on most database engines, it does. But that's purely a coincidence, because SQL does not define the order of rows in a result set—database engines are free to return rows in an order that is convenient for them unless an explicit order by clause is used. And the SQL generated by ActiveRecord for this query is just select `posts`.* from `posts` limit 1.


When talking about select statements, the MySQL reference manual says: "You may have noticed in the preceding examples that the result rows are displayed in no particular order." The Oracle documentation says: "Without an order_by_clause, no guarantee exists that the same query executed more than once will retrieve rows in the same order." And the PostgreSQL documentation says: "If ORDER BY is not given, the rows are returned in whatever order the system finds fastest to produce."


So that innocent select statement is just returning a row at the whim of the database engine. It could be the first. It could be the 42nd. It could be any row. The same applies to queries using limit and offset, often used to paginate results. Call Post.limit(10).offset(10) and ActiveRecord executes select `posts`.* from `posts` limit 10 offset 10. Again, there's no ordering applied, and no guarantee that the same rows will be returned given the same query.


Does this actually affect us? Not often. In fact, you've probably never seen it happen. I have seen the results of a query change when using Oracle, though: as a table filled, Oracle decided to reorganize an index, and paginating through a set of orders suddenly stopped displaying them in date order. Adding an explicit order by fixed it.


The moral? Well, first, this isn't a big deal. But whenever you use finders that assume an ordering in a result set, make that ordering explicit—add an order() call to the ARel chain. If you want first() to be compatible with last(), add order("id") to the call to first() (because, somewhat inconsistently, last() currently does add an order by id clause). If you want your paginated result sets to be consistent, make sure you order them (perhaps by id, or by created_at).
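

To make that advice concrete, here's a minimal sketch (the exact calls and order columns are illustrative, not quoted from the post):


# A deterministic "first": give the relation an explicit order before taking one row.
Post.order("id").first
# generates: SELECT `posts`.* FROM `posts` ORDER BY id LIMIT 1

# Stable pagination: order by something meaningful, with id as a tie-breaker.
Post.order("created_at, id").limit(10).offset(10)
# generates: SELECT `posts`.* FROM `posts` ORDER BY created_at, id LIMIT 10 OFFSET 10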


 

Published on March 21, 2012 20:06

February 2, 2012

Smart constants

I've been really enjoying James Edward Gray II's Rubies in the Rough articles. Every couple of weeks, he publishes something that is guaranteed to get me thinking about some aspect of coding I hadn't considered before.


His latest article is Part I of an exploration of an algorithm for the Hitting Rock Bottom problem posed by Gregory Brown & Andrea Singh. At its core, the problem asks you to simulate pouring water into a 2D container, filling it using a simple set of rules.


As I was coding up my solution, I found I had code like
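

(The original snippet didn't survive this archived copy; what follows is a minimal reconstruction of the kind of check being described, with cave, row, and col as assumed names.)


if cave[row][col] == " "      # the cell holds air...
  cave[row][col] = "~"        # ...so water can flow into it
end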



Here " " is a cell containing air, and "~" a watery cell. So clearly we should create some named constants for that:
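

(Again, a reconstruction of the lost snippet rather than the original code.)


AIR   = " "
WATER = "~"

if cave[row][col] == AIR
  cave[row][col] = WATER
end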



But it occurred to me that we could use Ruby's singleton methods to give AIR and WATER a little smarts:
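

(Once more, the original code is missing from this copy; this sketch shows the singleton-method idea the text describes.)


AIR   = " "
WATER = "~"

def AIR.watery?
  false
end

def WATER.watery?
  true
end

AIR.watery?     # => false
WATER.watery?   # => true

# Code that stores these same objects in the grid can then ask
# cell.watery? rather than comparing raw characters.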



Now you could argue that the cave object should do this: cave.watery?, or that the individual elements in the cave should be objects that know their moisture content, rather than simply characters. I don't agree with the first (simply because the cave is the container, and the water/air is the separate stuff that goes into that container). I have a lot of sympathy for the second, and I'd probably end up there given a sufficiently large nudge during a refactoring. 


But, for the problem at hand, simply decorating the two constants with a domain method seems to result in code that is a lot more readable. It isn't a technique I'd used before, so I thought I'd share.


(And remember to check out Rubies in the Rough…)

Published on February 02, 2012 15:30

November 13, 2011

Followup on the EMail Experiment

So the Hacker News folks are having a discussion about my email experiment last April. Many interesting points were raised. One fair question is "how did the experiment work out?" 


In short, it worked incredibly well.


I was mostly worried about annoying the folks sending me email. But the only feedback I got was positive. 


I was also a little worried about folks abusing the [urgent] flag. But that didn't happen, either. I had perhaps 5 or 6 urgent emails, and they were indeed things I needed to handle when I got back. Maybe I'm just lucky when it comes to the people who correspond with me.


The experiment had two positive effects on my life. First, the vacation was genuinely a lot nicer not having to worry about the sacks full of mail piling up for me when I got back. Was that selfish of me? Perhaps a little.


I wasn't expecting the other side effect. Since I returned from vacation, the quality of email I receive has improved, and the quantity I receive has dropped. I still enjoy interacting with all the people I need to interact with, and I still get to answer all the questions that need answering. It just seems that my inbox is somehow more focussed.


I have a theory. I think that, during the course of the preceding few years, I'd become something of a slave to my email. I'd answer stuff as it arrived. And those rapid responses would in turn trigger another round of email, and another. There was almost an adrenaline rush to it.


So my vacation broke that cycle. And now things are sane (or at least closer to sane).


 

Published on November 13, 2011 08:29

April 15, 2011

So I'm trying an email experiment

For the next 2 weeks, here's my vacation message:


 


Subject: I'm on vacation, and I've deleted your message—really


I know this sounds brutal, but here's the deal.


I can't remember the last time I took a vacation where I didn't actually end up working a few hours each day handling e-mail. I felt I had to, because if I didn't, the Inbox would just grow and grow, waiting for me when I got back—the idea that I'd be flying home to 5,000 messages would always be nagging at me, detracting from my holiday. So I worked (which also detracted from the holiday).


 This time, I'm taking a different approach. I'm asking for your help to make my break more enjoyable.


 I'm going to be discarding email I receive. That's right—your email will be recycled into warm, fluffy bit-jackets for underprivileged children. I won't see it.


If it's something you think I really, really need to know, you can bypass this brutality by putting "urgent" in the subject line. But before you do:



If it's something that can be handled by the wonderful Pragprog support folk, could you send your message to support@pragprog.com
If it can wait until I get back on April 25, please resend your message then.


It's an experiment. Bon voyage à moi!


 


Dave


 

Published on April 15, 2011 16:07

November 5, 2009

A business on the side…

So now I own part of a karate school!



My son Zachary has been studying karate for 5 or 6 years now. He got his black belt, and now he's teaching others while continuing to learn himself.



He used to go to a tremendous studio. What made it so great was a particular teacher, Teresa Paup. She's one of the most natural teachers I've ever met: strong and disciplined, but at the same time very open to the individuals under her care. Which made it all the more upsetting when the studio got into...

Published on November 05, 2009 16:03

October 20, 2009

Pragmatic Bookshelf Royalty Rates

When we started the Bookshelf, we wanted our authors to share in the success of their books, so we set our royalty rates to between 40% and 50%. That is, an author will receive between 40% and 50% of whatever gross profit we make on a book. We felt back then, and we still feel, that this is the highest overall royalty offered by any technical publisher (most publishers offer something in the 10–18% range).

But it isn't always easy to compare different publishers, because the definitions of...

Published on October 20, 2009 14:12