Just a couple of issues that jumped out at first glance and need addressing:
- Horrid, and I mean truly horrid, coding style. Please subscribe to the PSR coding standards. ASAP. They are unofficial, but all major players adhere to them, as should you.
- Use the fourth constructor argument of PDO, which allows you to configure the actual connection (see below).
- Think about encoding issues: SET NAMES 'UTF8'. Multi-byte chars are going to be processed as ASCII chars if you don't tell MySQL you're using the UTF-8 charset.
- Never trust the network. Don't blindly trust prepared statements, and don't blindly trust user-supplied data, especially when it's coming from $_GET. The $_SESSION super-global is the closest you can get to the gray area of user data you can trust (because you set it). $_POST and $_GET are right out.
I've been quite vocal on the fourth argument of the PDO constructor, and on the UTF-8 business, just recently. Instead of copy-pasting my previous answer, here's a link.
Basically, what I'd recommend you do is this:
$db = new PDO(
    'mysql:host=127.0.0.1;port=3306;dbname=myDb;charset=UTF8',
    $userName,
    $pass,
    array(
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        PDO::ATTR_ORACLE_NULLS       => PDO::NULL_NATURAL,
        PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ,
        //production only
        PDO::ATTR_EMULATE_PREPARES   => false,
        PDO::MYSQL_ATTR_INIT_COMMAND => "SET NAMES 'UTF8'"
    )
);
Why disable emulated prepares only in production code? Because debugging is probably something you do on your local machine, or in a VirtualBox + Vagrant setup. Emulating prepared statements is a good way to spot malformed queries without having to burden your DB server. Still, once you go live, use real prepared statements.
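As a sketch of that idea (the environment name "production" and the helper function are my own, not from your code), you could build the options array once and flip the emulation flag per environment:

```php
<?php
// A minimal sketch: build the PDO options array in one place and flip
// ATTR_EMULATE_PREPARES depending on the environment string.
// The 'production' name is an assumption; adapt it to your setup.
function buildPdoOptions($environment)
{
    return array(
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ,
        // emulated prepares are handy while debugging locally,
        // real prepared statements once you go live
        PDO::ATTR_EMULATE_PREPARES   => $environment !== 'production',
    );
}
```

That way there's exactly one place to look when you wonder which mode a given box is running in.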
What do I mean by "never trust the network"? I mean you really ought to sanitize, validate and check the user input you use in your code. Take snippet 3, for example; I'll apply some of the critiques I gave both here and in the linked review:
//proper indentation, whitespace is your friend
$searchStmt = $db->prepare(
    'SELECT * FROM items WHERE tag LIKE :usersearch'
);
$searchStmt->execute(
    [
        'usersearch' => "%".$_GET['search']."%"
    ]
);
Now suppose $_GET['search']'s value was one of the MySQL wildcards (_ or %, for example). They are perfectly valid strings, so they won't be escaped, resulting in a query not unlike:
SELECT * FROM items WHERE tag LIKE '%%%';
//or
SELECT * FROM items WHERE tag LIKE '%_%';
Both of which equate to a slower version of this query:
SELECT * FROM items;
i.e. you might end up selecting everything, and you are most likely performing a full table scan whenever this query is executed. That's bad.
But a few lines later, you seem to have realized that the value of $_GET['search'] might just be an empty string:
if ($_GET['search'] == "") {
header('Location: http://mywebsite.com/photo');
}
But why perform the query first? Why not check the value, and prevent performing a full table select?
There are a couple of other issues, like the foreach + include combination. Why not pour the code you need a couple of times into a function, include or require that file once, and call that function? It's less disk access, and less IO means better performance.
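A sketch of that idea (the file name, function name and markup are all made up for illustration):

```php
<?php
// itemCard.php -- required ONCE, defines the repeated markup as a function
function renderItemCard(array $item)
{
    // one template function instead of re-reading a file on every iteration;
    // htmlspecialchars also guards the output against XSS
    return sprintf(
        '<div class="item"><a href="%s">%s</a></div>',
        htmlspecialchars($item['link'], ENT_QUOTES, 'UTF-8'),
        htmlspecialchars($item['name'], ENT_QUOTES, 'UTF-8')
    );
}

// elsewhere:
//     require_once 'itemCard.php';
//     foreach ($items as $item) {
//         echo renderItemCard($item);
//     }
```

The include hits the disk once, and the loop only pays for a function call per iteration.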
You also have the bad habit of re-assigning things without good reason:
$form = $_POST;
$name = $form[ 'name' ];
$desc = $form[ 'desc' ];
$link = $form[ 'link' ];
$tag = $form[ 'tag' ];
$category = $form[ 'category' ];
$sql = "INSERT INTO items ( `name`, `desc`, `link`, `tag`, `category` ) VALUES ( :name, :desc, :link, :tag, :category )";
$query = $db->prepare( $sql );
$query->execute( array( ':name'=>$name, ':desc'=>$desc, ':link'=>$link, ':tag'=>$tag, ':category'=>$category ) );
So you assign $_POST to $form, and for every parameter you create a new variable. Then you create a string and assign it to a variable called $sql. From that string you create a PDOStatement instance and assign it to yet another variable called $query, on which you call the execute method, constructing a new array using the variables you just extracted from an array. Doesn't that seem a bit silly? Why not write the code like this:
$stmt = $db->prepare(
    'INSERT INTO items ( `name`, `desc`, `link`, `tag`, `category` )
     VALUES ( :name, :desc, :link, :tag, :category )'
);
$stmt->execute(
    array(
        ':name'     => $_POST['name'],
        ':desc'     => $_POST['desc'],
        ':link'     => $_POST['link'],
        ':tag'      => $_POST['tag'],
        ':category' => $_POST['category'],
    )
);
Not only does that look a hell of a lot tidier, it's actually more efficient, too. But there is still a problem with this code: I'm missing the isset checks. I could use a bunch of ifs or ternaries here; depending on what you want, and on if and which values are missing, you could then redirect, or use null as a default value. But a basic loop could work here, too:
/**
* Wrap the code in a function, so you can re-use it easily
* @param array $data the data to bind
* @param array $keys the keys which contain the values we need
* @return array
*/
function extractBind(array $data, array $keys)
{
    $bind = array(); //the return array
    foreach ($keys as $key) {
        $bind[':'.$key] = isset($data[$key]) ? $data[$key] : null;
    }
    return $bind;
}
//in your case, this would be how you use this function
$stmt = $db->prepare(
    'INSERT INTO items ( `name`, `desc`, `link`, `tag`, `category` )
     VALUES ( :name, :desc, :link, :tag, :category )'
);
$stmt->execute(
    extractBind(
        $_POST,
        array(
            'name',
            'desc',
            'link',
            'tag',
            'category'
        )
    )
);
Now, I'm not saying you should use this code as-is; I'm just pointing out ways to make your life easier. You can use the function I wrote here, but you shouldn't pass the array it returns to the $stmt->execute call right away. Remember: never trust the network.
The values in the bind array should still be sanitized and validated. Check my profile on this site and read through a couple of my "sanitize" and "validate" answers. I've also posted a couple of answers on XSS vulnerabilities, which might interest you, too.
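To make that concrete, here's a minimal sketch of what "validate before execute" could look like for that bind array. The rules are invented examples; yours will depend on the actual columns:

```php
<?php
// Sketch: validate the bound values before calling execute().
// These rules are examples only -- tailor them to your schema.
function validateItem(array $bind)
{
    $errors = array();
    if (!isset($bind[':name']) || trim($bind[':name']) === '') {
        $errors[] = 'name is required';
    }
    if (isset($bind[':link'])
        && filter_var($bind[':link'], FILTER_VALIDATE_URL) === false
    ) {
        $errors[] = 'link must be a valid URL';
    }
    return $errors; // empty array means "looks fine"
}
```

If the returned array is non-empty, redirect or show the errors instead of ever touching the database. (XSS prevention, by contrast, belongs on output: run the values through htmlspecialchars when you echo them, not when you store them.)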