Details
Joined devRant on 11/15/2017
-
Ducked Ducked Go
-
Read Steven Lord’s answer here; he explains it pretty well: https://groups.google.com/forum/m/...
-
@HelloUglyWorld this idiom in Python is specifically optimised in the bytecode to push the two values onto the stack, rotate them, and pop them again. I’m pretty sure that’s faster than using a function or three variables.
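If anyone wants to check, dis shows the bytecode CPython generates for the swap (a minimal sketch; the exact opcodes vary by version — older releases use ROT_TWO, newer ones a SWAP instruction):
import dis

def swap(a, b):
    # the tuple-swap idiom in question
    a, b = b, a
    return a, b

dis.dis(swap)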
-
Indeed I mostly did learn it without the assistance of a professor. But others did not, and it is that fact which is the important part. Having to use version control for a group project when there’s already someone (me) in the group who knows how it works just means everyone else makes you commit their changes for them, instead of taking the time to learn.
I’m glad that part of my life is over now. -
If you’d worked in some of the uni project groups I worked in where nobody understood version control, you’d see why it’s essential to teach.
-
People outside CS write code too; does that mean CS shouldn’t teach how to write code?
-
“In software engineering, a software design pattern is a general, reusable solution to a commonly occurring problem within a given context in software design. It is not a finished design that can be transformed directly into source or machine code. It is a description or template for how to solve a problem that can be used in many different situations. Design patterns are formalized best practices that the programmer can use to solve common problems when designing an application or system.”
It solves the problem of “how do I update my codebase in a stable manner and retain the ability to revert changes if they turn out to break things”, among others. It definitely qualifies as a design pattern. -
It’s not about the specific tool (git, svn, hg), it’s about the concept of version control. That’s what people need to learn.
-
Using version control effectively is itself a design pattern.
-
Yes, let’s teach them a foundational tool that facilitates easily learning those other concepts and being able to effectively work with others to write software. Can you imagine a good software architect that cannot work in a team without breaking the codebase?
-
@Fradow if I had to guess, I’d say the reason most of them don’t know about proper branching and merging is that during university they were never made to do any projects large enough to truly require multiple feature branches. And if they did, it was in a group, where the one person who knew how to use Git would do all the merging and rebasing for everyone.
-
A candidate with a year or two of experience is more attractive than one with a degree, but you still need a degree to get that experience in the first place. And the fact that basically the only way to get a job is to go to college means that college *is* a trade school.
Technically you don’t “need” a degree to work in software, but when you have two candidates, one with a degree and one without, it almost always makes more sense to hire the one with the degree. -
In the real world, your coworkers would be fired if they didn’t do any work. Whereas at university, your team not doing work just means you have to do it all yourself, and then the rest of the team can still lie on the peer evaluation forms and claim you didn’t do any work, so you get a bad mark anyway.
-
I assumed that many reputable universities have an engineering faculty that offers a major in software engineering, as well as a science faculty that offers a major in computer science. Is this not true?
-
@knight
Is it really that hard? Doesn’t seem like it.
list(set().union(a, b, c))
or
list(set.intersection(*[set(i) for i in [a, b, c]]))
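For example, with some throwaway lists (values made up; the result order isn’t guaranteed, since sets are unordered):
a, b, c = [1, 2, 3], [2, 3, 4], [3, 4, 5]

print(list(set().union(a, b, c)))                            # every distinct value, e.g. [1, 2, 3, 4, 5]
print(list(set.intersection(*[set(i) for i in [a, b, c]])))  # values common to all three: [3]
-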
@AndSoWeCode the point is that the only way to design a database that doesn’t violate the closed world assumption is to never allow the use of NULL anywhere within it, because the act of allowing NULL to be in the database violates the assumption.
If I have Anna (170cm tall), Bob (180cm tall), and Chris (175cm tall) in my “person” table, and there is a column for their heights, but Anna’s height is missing, does that mean that Chris is the shortest person in the table? No. If I use LEAST() on their heights, the answer should be unknown, because the database doesn’t know whether Anna is shorter than Chris.
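A runnable illustration, using SQLite via Python’s sqlite3 purely because it’s easy to try (SQLite’s multi-argument MIN() plays the role of LEAST here):
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE person (name TEXT, height INTEGER)")
db.executemany("INSERT INTO person VALUES (?, ?)",
               [("Anna", None), ("Bob", 180), ("Chris", 175)])  # Anna's height is unknown

# Comparing Chris's height with Anna's unknown height is itself unknown, not true or false:
print(db.execute("SELECT (SELECT height FROM person WHERE name = 'Chris') < "
                 "(SELECT height FROM person WHERE name = 'Anna')").fetchone()[0])  # None

# And the row-wise "least" across an unknown value is unknown too:
print(db.execute("SELECT MIN(height, 175) FROM person WHERE name = 'Anna'").fetchone()[0])  # None
-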
@AndSoWeCode it is just one particular case, but the functions’ implementations can’t ignore that case, because it is part of how the database is defined. If a value is missing, the function cannot assume that this is because it is inapplicable, it has to allow for the possibility that it is applicable but unknown.
If you store the date of death, a similar problem still arises: the fact that the date of death is missing does not imply that the person is alive. We may just not know when they died, or not know that they have died. -
@AndSoWeCode you said earlier that “null is not unknown, it represents the absence of a value”. Those two things are not inconsistent, since null can represent that a value is either missing or inapplicable. The former is what is being referred to as an unknown value.
I’m not sure what point you’re trying to make by talking about the difference between null and 0, because it doesn’t sound like we disagree on that.
“Computers lack information about lots of stuff. It doesn’t mean that they violate the closed world assumption.” It often does. The closed world assumption is that everything the system does not currently know to be true must therefore be false. This is clearly incorrect when it comes to databases: for example, if you have a column called “alive” in a table of people, the fact that one of the rows has a null value instead of a truth value does not necessarily mean the person is dead; it just means you don’t know whether they’re alive or dead.
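A sketch of that case (same sqlite3 approach, made-up rows; 1/0 stand in for true/false since SQLite has no real BOOLEAN type):
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE people (name TEXT, alive INTEGER)")  # 1 = alive, 0 = dead, NULL = unknown
db.executemany("INSERT INTO people VALUES (?, ?)",
               [("Alice", 1), ("Bob", 0), ("Carol", None)])

# Carol is in neither result: for her row both predicates evaluate to NULL,
# and a WHERE clause only keeps rows whose condition is actually true.
print(db.execute("SELECT name FROM people WHERE alive = 1").fetchall())  # [('Alice',)]
print(db.execute("SELECT name FROM people WHERE alive = 0").fetchall())  # [('Bob',)]
-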
@AndSoWeCode I definitely agree about Oracle.
-
@AndSoWeCode the page you linked states that NULL is a state, not a value, and that “SQL Null serves to fulfil the requirement that all true relational database management systems (RDBMS) support a representation of "missing information and inapplicable information".”
The page also explicitly states that NULL violates the closed world assumption: “Another point of conflict concerning Nulls is that they violate the closed world assumption model of relational databases... [and instead] operate under the open world assumption, in which some items stored in the database are considered unknown, making the database's stored knowledge of the world incomplete.” -
@AndSoWeCode
“NULL indicates that the value is unknown. A null value is different from an empty or zero value. No two null values are equal. Comparisons between two null values, or between a null value and any other value, return unknown because the value of each NULL is unknown.”
https://docs.microsoft.com/en-us/...
-- IFNULL swaps NULL for the largest signed BIGINT (~0 >> 1), so LEAST effectively ignores missing values
SELECT LEAST(
    IFNULL(a, ~0 >> 1),
    IFNULL(b, ~0 >> 1),...
)
I agree that it’s kinda bad, but I think the reason it was changed in a minor version was that the previous implementation of the functions was wrong, because it didn’t match the results you would get in PL/SQL.
Why can’t you just use MIN()? -
Since NULL represents an unknown value, if you’re trying to work out which number is greatest in a set that includes, for example, NULL, 4, 7, and 9... the answer has to be NULL (ie you don’t know) because you can’t be sure that the unknown value is smaller than 9.
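The same thing, sketched with sqlite3 (its multi-argument MAX() plays the role of GREATEST, while the aggregate MAX()/MIN() simply skip NULLs, which is presumably what the MIN() suggestion above relies on):
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE nums (n INTEGER)")
db.executemany("INSERT INTO nums VALUES (?)", [(None,), (4,), (7,), (9,)])

# Row-wise "greatest" over a set containing an unknown value is unknown:
print(db.execute("SELECT MAX(n, 4, 7, 9) FROM nums WHERE n IS NULL").fetchone()[0])  # None

# The aggregates just ignore the NULL row instead:
print(db.execute("SELECT MAX(n), MIN(n) FROM nums").fetchone())  # (9, 4)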
-
They may not be planning on discontinuing the iMac just yet, but they are deprecating a lot of the features of macOS Server. https://support.apple.com/en-us/...