Hacker News | parbo's comments

It's never too late to start. If you love it, it'll work out anyway.

I started "programming" at 10 but didn't do anything useful until 16-17. I didn't program for real until I was 25, and it wasn't until I was past 30 that I felt like a proper programmer.


Amen. That's the story of my life too. Nibbled around and played around for a long time, but not really getting into it. But boy, when I did get into it, did I get into it! It's been a lot of fun since.


This is almost exactly my life story, only that I haven't reached my 30s yet.

Reading about all the 18- or 19-year-olds forming million-dollar startups really does make me feel like an underachiever sometimes.


I assume it's patented.



Wrapp does this, but with gift certificates. Multiple friends can chip in to the same gift certificate. https://www.wrapp.com/


The lead developer of SDL is now working at Valve, so I guess that is a very likely solution.


Do you mean Ryan Gordon?



Bring up the Unity dash, go to the apps section (icon with ruler/pen), click "See X more results" in the Installed section. You can filter these results on category. Not perfect, but workable.

To get a second instance of the terminal, right click the launcher icon and select "New terminal". Or middle click the launcher icon. Or press ctrl+alt+t.

How can you say that Unity is still crap, if you haven't tried it in 12.04?


The best part about Ikea is the experience when you return something. No fuss, and it's no biggie if you lost the receipt or come back after the return period has ended. Awesome customer treatment.


Independent music stores were killed by iTunes Music Store.


Compiler bugs are always satisfying to find. "It's never the compiler", but sometimes it is: http://llvm.org/bugs/show_bug.cgi?id=12419

But I think my best one was when I found a bug in a third-party communication library that only manifested itself on the ARM architecture. Our CPU was PPC and our simulator was x86; on both of those, unaligned memory reads work fine. But the actual unit we'd talk to had an ARM CPU, and on ARM it just reads from the closest(?) aligned address. I found that by looking at network logs and reading the source (which we thankfully had).
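That failure mode can be sketched in C; the function names here are illustrative, not the actual library's:

```c
#include <stdint.h>
#include <string.h>

/* Dereferencing a uint32_t pointer at an arbitrary byte offset is
 * undefined behavior in C. x86 and the PPC in question happen to
 * tolerate it; some ARM cores instead silently read from a
 * rounded-down aligned address, producing garbage. */
uint32_t read_u32_unsafe(const uint8_t *p) {
    return *(const uint32_t *)p;
}

/* Portable version: memcpy into a local, and the compiler emits
 * whatever load sequence the target actually supports. */
uint32_t read_u32_safe(const uint8_t *p) {
    uint32_t v;
    memcpy(&v, p, sizeof v);
    return v;
}
```

On x86 both versions return the same value, which is exactly why the bug never showed up in the simulator.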


Funny - I remember when one of our new engineers needed a memory-copy routine for our ARM embedded solution - he went to the Linux source and grabbed some library routine.

It faulted the first time I used it. I fixed the bug (alignment of the source), ran it again, and it faulted again.

So I spent 10 minutes writing a test: move 0-128 bytes from source buffer offset 0-128 to destination buffer offset 0-128. Simple, overkill, right?

11 bugs later, the damned memory copy worked. 11.
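A sketch of that sweep in C (`my_memcpy` is a hypothetical stand-in for the routine under test; here it wraps plain `memcpy`, so the sweep passes):

```c
#include <stddef.h>
#include <string.h>

/* Stand-in for the routine under test. Swap in the suspect
 * implementation here instead of the libc one. */
void *my_memcpy(void *dst, const void *src, size_t n) {
    return memcpy(dst, src, n);
}

/* Sweep every combination of source offset, destination offset and
 * length from 0 to 128. The buffers are padded and pre-filled with
 * a sentinel, and the whole destination is compared against a
 * trusted reference, so writes outside the requested range are
 * caught too. */
int test_memcpy(void) {
    unsigned char src[300], dst[300], ref[300];
    for (size_t i = 0; i < sizeof src; i++)
        src[i] = (unsigned char)i;
    for (size_t so = 0; so <= 128; so++)
        for (size_t dofs = 0; dofs <= 128; dofs++)
            for (size_t n = 0; n <= 128; n++) {
                memset(dst, 0xAA, sizeof dst);
                memset(ref, 0xAA, sizeof ref);
                memcpy(ref + dofs, src + so, n);    /* trusted reference */
                my_memcpy(dst + dofs, src + so, n); /* routine under test */
                if (memcmp(dst, ref, sizeof dst) != 0)
                    return 0; /* a failing (so, dofs, n) combination */
            }
    return 1;
}
```

About 2.1 million cases, which runs in a second or two and flushes out exactly the off-by-one and alignment bugs described above.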

The next thing to ask is: what did I learn from that bug? What I learned is: accept NO CODE as bug-free, no matter the source, no matter how authoritative its origin.

Other learning: why oh why don't CPU designers put a damned memory-copy instruction into the machine? We all need it, all the time, for every project and we all hack something together that works until it doesn't. Sigh.


You can't express memcpy in hardware any more efficiently than you can in C because of the way memory controllers work. It'd end up being microcoded, and ARM can't afford that for the same reason it can't afford unaligned access.

I think x86 does have a microcoded memcpy (REP MOVS), but efficiency varies.


You mention the memory controller; that's probably where the logic belongs, not on the processor. The microcode would then come down to "ask the memory controller to move; wait for completion."


That's not actually any better speed-wise. And the CPU would still have to microcode the copy because of caches (think of what involvement the memory controller would have in doing a cache-to-cache copy).

The real gain in being able to have the memory controller do a memcpy() independent of the CPU would be to let the CPU operate on data out of its caches in parallel to the memcpy() being executed. But that only helps for a very specific class of memcpy() and is highly system dependent (you have to worry about the expense of keeping caches coherent among other things.) Anyway, an integrated GPU or other additional block of hardware behind the memory controller is a better candidate for this sort of thing than a user-level CPU instruction.

Also, use a better libc.


> why oh why don't CPU designers put a damned memory-copy instruction into the machine?

x86 has had REP MOVSB since forever, complete with a direction flag so you can handle the cases where the source and destination regions overlap. But it fell out of favor because for a while, from the 80386 through the early Pentiums (when Linux was written), REP MOVSx was slower than an explicit copy loop.
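For reference, invoking it from C looks roughly like this with GCC-style inline assembly on x86-64 (a sketch only; `rep_movsb_copy` is an illustrative name, it only handles the forward, non-overlapping case, and real code should normally just call memcpy):

```c
#include <stddef.h>

/* Forward byte copy via REP MOVSB: RDI = destination, RSI = source,
 * RCX = count. The direction flag is assumed clear (forward copy),
 * which the System V ABI guarantees on function entry. On CPUs with
 * "Enhanced REP MOVSB" (ERMSB) this is competitive with tuned
 * copy loops. */
void rep_movsb_copy(void *dst, const void *src, size_t n) {
    __asm__ volatile("rep movsb"
                     : "+D"(dst), "+S"(src), "+c"(n)
                     :
                     : "memory");
}
```

Handling overlap would mean setting the direction flag with STD and starting from the buffer ends, which is exactly the case the flag exists for.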

That said, such an instruction would seem to go against RISC philosophies, where you want your operations to be small and atomic and predictable in terms of time and resource consumption.


Right! Foolish programmer! Using REP MOVSB had been broken since about the 2nd issuance of the processor. Dumb folks (read: DOS) used it as a timing loop to calibrate interrupt timers, complained when it got faster and broke their code, so Intel 'dumbed it down' till it was about the worst way to move memory you could try.

So you say it works again. Cool!

Maybe what we really need is some sort of 'architecture library' that compilers resort to for things like this. Maybe an instruction, maybe a routine, but guaranteed to work for every wrinkle in the architecture.

Because if it's not in the compiler, folks will continue to cobble together buggy code of their own, with only a vague idea of the vast architecture landscape they are navigating blindly.


Maybe I'm stupid, but gitster/git doesn't sound like the mainline for git.


It's the account of Junio C Hamano, who is the main maintainer of git.


and the changes are now also in git/git


You only need one tip: have a kid. Won't help with productivity, though.


How about a cat? There is no snooze button on a cat that wants to be fed. :-)


I trained my ex-cat. I'd get up when I damned well felt like it.

My ex-girlfriend (girlfriend at the time) would get up to feed the cats.

So he'd bug her on the mornings she was in, and snuggle up with me on the mornings she wasn't.

The same cat learned not to beg at the dinner table for food, even when sushi was being served. And yes, he loved salmon. Begging resulted in a "treat": a dab of wasabi applied to the nose.

He learned after the 2nd time.

TL;DR: with a sufficiently trainable cat, this doesn't work.


Are you comparing the amount of attention cats need with the amount children need (particularly infants and toddlers)? You are so going to have a wake-up call one day :)


I guess I was thinking in terms of having the morning wake-up without the full commitment of a child.

I am quite aware of the differences between a human infant and a cat.


Hmmm... two kids, eight cats... Kids are easier.


What age are the kids?


Now the kids are 24 and 20 :) But they slept more at night than the cats did.


Kids have snooze buttons?


You need to adjust your cat's feeding cycle.


You stole my comment; that's exactly what I was going to say...


Ditto. I can't believe this isn't the top reply. Three and a half years post-fatherhood, I don't even remember what it feels like to sleep late. I've gotten better at managing bedtimes (lights out by 10:30-11 pretty much every night), and I wake up on my own by 6:30 even on mornings when my wife is handling breakfast.

All of which is just corroborating the linked post: just do it. Make yourself go to sleep at a reasonable hour. Make yourself wake up, by whatever means you want. You'll find you adapt surprisingly well.

